
These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies;
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that humanity was steadily advancing, and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that religion can only thrive if it is no part of government.
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change.
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country.
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest numbers came from Ireland and Germany.
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • The geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, other critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • The territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description — “Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____” — as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 (Lew Wa Ho worked at a dry goods shop in St. Louis; his photograph was included in his Immigration Service case file as evidence of employment.)
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,

        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • Like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson.
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • Its leaders argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.”
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • Nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department.
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

12 Rules for Life: An Antidote to Chaos (Jordan B. Peterson) - 0 views

  • RULES? MORE RULES? REALLY? Isn’t life complicated enough, restricting enough, without abstract rules that don’t take our unique, individual situations into account? And given that our brains are plastic, and all develop differently based on our life experiences, why even expect that a few rules might be helpful to us all?
  • “I’ve got some good news…and I’ve got some bad news,” the lawgiver yells to them. “Which do you want first?” “The good news!” the hedonists reply. “I got Him from fifteen commandments down to ten!” “Hallelujah!” cries the unruly crowd. “And the bad?” “Adultery is still in.”
  • Maps of Meaning was sparked by Jordan’s agonized awareness, as a teenager growing up in the midst of the Cold War, that much of mankind seemed on the verge of blowing up the planet to defend their various identities. He felt he had to understand how it could be that people would sacrifice everything for an “identity,”
  • the story of the golden calf also reminds us that without rules we quickly become slaves to our passions—and there’s nothing freeing about that.
  • And the story suggests something more: unchaperoned, and left to our own untutored judgment, we are quick to aim low and worship qualities that are beneath us—in this case, an artificial animal that brings out our own animal instincts in a completely unregulated way.
  • Similarly, in this book Professor Peterson doesn’t just propose his twelve rules, he tells stories, too, bringing to bear his knowledge of many fields as he illustrates and explains why the best rules do not ultimately restrict us but instead facilitate our goals and make for fuller, freer lives.
  • Peterson wasn’t really an “eccentric”; he had sufficient conventional chops, had been a Harvard professor, was a gentleman (as cowboys can be) though he did say damn and bloody a lot, in a rural 1950s sort of way. But everyone listened, with fascination on their faces, because he was in fact addressing questions of concern to everyone at the table.
  • unlike many academics who take the floor and hold it, if someone challenged or corrected him he really seemed to like it. He didn’t rear up and neigh. He’d say, in a kind of folksy way, “Yeah,” and bow his head involuntarily, wag it if he had overlooked something, laughing at himself for overgeneralizing. He appreciated being shown another side of an issue, and it became clear that thinking through a problem was, for him, a dialogic process.
  • for an egghead Peterson was extremely practical. His examples were filled with applications to everyday life: business management, how to make furniture (he made much of his own), designing a simple house, making a room beautiful (now an internet meme) or in another, specific case related to education, creating an online writing project that kept minority students from dropping out of school by getting them to do a kind of psychoanalytic exercise on themselves,
  • These Westerners were different: self-made, unentitled, hands on, neighbourly and less precious than many of their big-city peers, who increasingly spend their lives indoors, manipulating symbols on computers. This cowboy psychologist seemed to care about a thought only if it might, in some way, be helpful to someone.
  • I was drawn to him because here was a clinician who also had given himself a great books education, and who not only loved soulful Russian novels, philosophy and ancient mythology, but who also seemed to treat them as his most treasured inheritance. But he also did illuminating statistical research on personality and temperament, and had studied neuroscience. Though trained as a behaviourist, he was powerfully drawn to psychoanalysis with its focus on dreams, archetypes, the persistence of childhood conflicts in the adult, and the role of defences and rationalization in everyday life. He was also an outlier in being the only member of the research-oriented Department of Psychology at the University of Toronto who also kept a clinical practice.
  • Maps of Meaning, published nearly two decades ago, shows Jordan’s wide-ranging approach to understanding how human beings and the human brain deal with the archetypal situation that arises whenever we, in our daily lives, must face something we do not understand.
  • The brilliance of the book is in his demonstration of how rooted this situation is in evolution, our DNA, our brains and our most ancient stories. And he shows that these stories have survived because they still provide guidance in dealing with uncertainty, and the unavoidable unknown.
  • this is why many of the rules in this book, being based on Maps of Meaning, have an element of universality to them.
  • We are ambivalent about rules, even when we know they are good for us. If we are spirited souls, if we have character, rules seem restrictive, an affront to our sense of agency and our pride in working out our own lives. Why should we be judged according to another’s rule?
  • And he felt he had to understand the ideologies that drove totalitarian regimes to a variant of that same behaviour: killing their own citizens.
  • Ideologies are simple ideas, disguised as science or philosophy, that purport to explain the complexity of the world and offer remedies that will perfect it.
  • Ideologues are people who pretend they know how to “make the world a better place” before they’ve taken care of their own chaos within.
  • Ideologies are substitutes for true knowledge, and ideologues are always dangerous when they come to power, because a simple-minded I-know-it-all approach is no match for the complexity of existence.
  • To understand ideology, Jordan read extensively about not only the Soviet gulag, but also the Holocaust and the rise of Nazism. I had never before met a person, born Christian and of my generation, who was so utterly tormented by what happened in Europe to the Jews, and who had worked so hard to understand how it could have occurred.
  • I saw what now millions have seen online: a brilliant, often dazzling public speaker who was at his best riffing like a jazz artist; at times he resembled an ardent Prairie preacher (not in evangelizing, but in his passion, in his ability to tell stories that convey the life-stakes that go with believing or disbelieving various ideas). Then he’d just as easily switch to do a breathtakingly systematic summary of a series of scientific studies. He was a master at helping students become more reflective, and take themselves and their futures seriously. He taught them to respect many of the greatest books ever written. He gave vivid examples from clinical practice, was (appropriately) self-revealing, even of his own vulnerabilities, and made fascinating links between evolution, the brain and religious stories.
  • Above all, he alerted his students to topics rarely discussed in university, such as the simple fact that all the ancients, from Buddha to the biblical authors, knew what every slightly worn-out adult knows, that life is suffering.
  • chances are, if you or someone you love is not suffering now, they will be within five years, unless you are freakishly lucky. Rearing kids is hard, work is hard, aging, sickness and death are hard, and Jordan emphasized that doing all that totally on your own, without the benefit of a loving relationship, or wisdom, or the psychological insights of the greatest psychologists, only makes it harder.
  • focused on triumphant heroes. In all these triumph stories, the hero has to go into the unknown, into an unexplored territory, and deal with a new great challenge and take great risks. In the process, something of himself has to die, or be given up, so he can be reborn and meet the challenge. This requires courage, something rarely discussed in a psychology class or textbook.
  • Views of Jordan’s first YouTube statements quickly numbered in the hundreds of thousands. But people have kept listening because what he is saying meets a deep and unarticulated need. And that is because alongside our wish to be free of rules, we all search for structure.
  • the first generation to have been so thoroughly taught two seemingly contradictory ideas about morality, simultaneously—at their schools, colleges and universities, by many in my own generation. This contradiction has left them at times disoriented and uncertain, without guidance and, more tragically, deprived of riches they don’t even know exist.
  • The first idea or teaching is that morality is relative, at best a…
  • morality and the rules associated with it are just a matter of personal opinion or happenstance, “relative to” or “related to” a particular framework, such as one’s ethnicity, one’s upbringing, or the culture or historical…
  • So, the decent thing to do—once it becomes apparent how arbitrary your, and your society’s, “moral values” are—is to show tolerance for people who think differently, and…
  • That emphasis on tolerance is so paramount that for many people one of the worst character flaws a person can have is to be “judgmental.”* And, since we don’t know right from wrong, or what is good, just about the most inappropriate thing an…
  • And so a generation has been raised untutored in what was once called, aptly, “practical wisdom,” which guided previous generations. Millennials, often told they have received the finest education available anywhere, have actually…
  • professors, chose to devalue thousands of years of human knowledge about how to acquire virtue, dismissing it as passé, “…
  • They were so successful at it that the very word “virtue” sounds out of date, and someone using it appears…
  • The study of virtue is not quite the same as the study of morals (right and wrong, good and evil). Aristotle defined the virtues simply as the ways of behaving that are most conducive to happiness in life. Vice was…
  • Cultivating judgment about the difference between virtue and vice is the beginning of wisdom, something…
  • By contrast, our modern relativism begins by asserting that making judgments about how to live is impossible, because there is no real good, and no…
  • Thus relativism’s closest approximation to “virtue” is “tolerance.” Only tolerance will provide social cohesion between different groups, and save us from harming each other. On Facebook and other forms of social media, therefore, you signal your so-called…
  • Intolerance of others’ views (no matter how ignorant or incoherent they may be) is not simply wrong; in a world where there is no right or wrong, it is worse: it is a sign you are…
  • But it turns out that many people cannot tolerate the vacuum—the chaos—which is inherent in life, but made worse by this moral relativism; they cannot live without a moral compass,…
  • So, right alongside relativism, we find the spread of nihilism and despair, and also the opposite of moral relativism: the blind certainty offered by ideologies…
  • Dr. Norman Doidge, MD, is the author of The Brain That Changes Itself
  • so we arrive at the second teaching that millennials have been bombarded with. They sign up for a humanities course, to study the greatest books ever written. But they’re not assigned the books; instead they are given…
  • (But the idea that we can easily separate facts and values was and remains naive; to some extent, one’s values determine what one will pay…
  • For the ancients, the discovery that different people have different ideas about how, practically, to live, did not paralyze them; it deepened their understanding of humanity and led to some of the most satisfying conversations human beings have ever had, about how life might be lived.
  • Modern moral relativism has many sources. As we in the West learned more history, we understood that different epochs had different moral codes. As we travelled the seas and explored the globe, we learned of far-flung tribes on different continents whose different moral codes made sense relative to, or within the framework of, their societies. Science played a role, too, by attacking the religious view of the world, and thus undermining the religious grounds for ethics and rules. Materialist social science implied that we could divide the world into facts (which all could observe, and were objective and “real”) and values (…
  • it seems that all human beings are, by some kind of biological endowment, so ineradicably concerned with morality that we create a structure of laws and rules wherever we are. The idea that human life can be free of moral concerns is a fantasy.
  • given that we are moral animals, what must be the effect of our simplistic modern relativism upon us? It means we are hobbling ourselves by pretending to be something we are not. It is a mask, but a strange one, for it mostly deceives the one who wears it.
  • Far better to integrate the best of what we are now learning with the books human beings saw fit to preserve over millennia, and with the stories that have survived, against all odds, time’s tendency to obliterate.
  • these really are rules. And the foremost rule is that you must take responsibility for your own life. Period.
  • Jordan’s message that each individual has ultimate responsibility to bear; that if one wants to live a full life, one first sets one’s own house in order; and only then can one sensibly aim to take on bigger responsibilities.
  • if it’s uncertain that our ideals are attainable, why do we bother reaching in the first place? Because if you don’t reach for them, it is certain you will never feel that your life has meaning.
  • And perhaps because, as unfamiliar and strange as it sounds, in the deepest part of our psyche, we all want to be judged.
  • Instead of despairing about these differences in moral codes, Aristotle argued that though specific rules, laws and customs differed from place to place, what does not differ is that in all places human beings, by their nature, have a proclivity to make rules, laws and customs.
  • Freud never argued (as do some who want all culture to become one huge group therapy session) that one can live one’s entire life without ever making judgments, or without morality. In fact, his point in Civilization and Its Discontents is that civilization only arises when some restraining rules and morality are in place.
  • Aleksandr Solzhenitsyn, the great documenter of the slave-labour-camp horrors of the latter, once wrote that the “pitiful ideology” holding that “human beings are created for happiness” was an ideology “done in by the first blow of the work assigner’s cudgel.”1 In a crisis, the inevitable suffering that life entails can rapidly make a mockery of the idea that happiness is the proper pursuit of the individual. On the radio show, I suggested, instead, that a deeper meaning was required. I noted that the nature of such meaning was constantly re-presented in the great stories of the past, and that it had more to do with developing character in the face of suffering than with happiness.
  • I proposed in Maps of Meaning that the great myths and religious stories of the past, particularly those derived from an earlier, oral tradition, were moral in their intent, rather than descriptive. Thus, they did not concern themselves with what the world was, as a scientist might have it, but with how a human being should act.
  • I suggested that our ancestors portrayed the world as a stage—a drama—instead of a place of objects. I described how I had come to believe that the constituent elements of the world as drama were order and chaos, and not material things.
  • Order is where the people around you act according to well-understood social norms, and remain predictable and cooperative. It’s the world of social structure, explored territory, and familiarity. The state of Order is typically portrayed, symbolically—imaginatively—as masculine.
  • Chaos, by contrast, is where—or when—something unexpected happens.
  • As the antithesis of symbolically masculine order, it’s presented imaginatively as feminine. It’s the new and unpredictable suddenly emerging in the midst of the commonplace familiar. It’s Creation and Destruction,
  • Order is the white, masculine serpent; Chaos, its black, feminine counterpart. The black dot in the white—and the white in the black—indicate the possibility of transformation: just when things seem secure, the unknown can loom, unexpectedly and large. Conversely, just when everything seems lost, new order can emerge from catastrophe and chaos.
  • For the Taoists, meaning is to be found on the border between the ever-entwined pair. To walk that border is to stay on the path of life, the divine Way. And that’s much better than happiness.
  • trying to address a perplexing problem: the reason or reasons for the nuclear standoff of the Cold War. I couldn’t understand how belief systems could be so important to people that they were willing to risk the destruction of the world to protect them. I came to realize that shared belief systems made people intelligible to one another—and that the systems weren’t just about belief.
  • People who live by the same code are rendered mutually predictable to one another. They act in keeping with each other’s expectations and desires. They can cooperate. They can even compete peacefully, because everyone knows what to expect from everyone else.
  • Shared beliefs simplify the world, as well, because people who know what to expect from one another can act together to tame the world. There is perhaps nothing more important than the maintenance of this organization—this simplification. If it’s threatened, the great ship of state rocks.
  • It isn’t precisely that people will fight for what they believe. They will fight, instead, to maintain the match between what they believe, what they expect, and what they desire. They will fight to maintain the match between what they expect and how everyone is acting. It is precisely the maintenance of that match that enables everyone…
  • There’s more to it, too. A shared cultural system stabilizes human interaction, but is also a system of value—a hierarchy of value, where some things are given priority and importance and others are not. In the absence of such a system of value, people simply cannot act. In fact, they can’t even perceive, because both action and perception require a goal, and a valid goal is, by necessity, something valued.
  • We experience much of our positive emotion in relation to goals. We are not happy, technically speaking, unless we see ourselves progressing—and the very idea of progression implies value.
  • Worse yet is the fact that the meaning of life without positive value is not simply neutral. Because we are vulnerable and mortal, pain and anxiety are an integral part of human existence. We must have something to set against the suffering that is intrinsic to Being.*2 We must have the meaning inherent in a profound system of value or the horror of existence rapidly becomes paramount. Then, nihilism beckons, with its hopelessness and despair.
  • So: no value, no meaning. Between value systems, however, there is the possibility of conflict. We are thus eternally caught between the most diamantine rock and the hardest of places:
  • loss of group-centred belief renders life chaotic, miserable, intolerable; presence of group-centred belief makes conflict with other groups inevitable.
  • In the West, we have been withdrawing from our tradition-, religion- and even nation-centred cultures, partly to decrease the danger of group conflict. But we are increasingly falling prey to the desperation of meaninglessness, and that is no improvement at all.
  • While writing Maps of Meaning, I was (also) driven by the realization that we can no longer afford conflict—certainly not on the scale of the world conflagrations of the twentieth century.
  • I came to a more complete, personal realization of what the great stories of the past continually insist upon: the centre is occupied by the individual.
  • It is possible to transcend slavish adherence to the group and its doctrines and, simultaneously, to avoid the pitfalls of its opposite extreme, nihilism. It is possible, instead, to find sufficient meaning in individual consciousness and experience.
  • How could the world be freed from the terrible dilemma of conflict, on the one hand, and psychological and social dissolution, on the other? The answer was this: through the elevation and development of the individual, and through the willingness of everyone to shoulder the burden of Being and to take the heroic path. We must each adopt as much responsibility as possible for individual life, society and the world.
  • We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated. It is in this manner that we can and must reduce the suffering that poisons the world. It’s asking a lot. It’s asking for everything.
  • the alternative—the horror of authoritarian belief, the chaos of the collapsed state, the tragic catastrophe of the unbridled natural world, the existential angst and weakness of the purposeless individual—is clearly worse.
  • a title: 12 Rules for Life: An Antidote to Chaos. Why did that one rise up above all others? First and foremost, because of its simplicity. It indicates clearly that people need ordering principles, and that chaos otherwise beckons.
  • We require rules, standards, values—alone and together. We’re pack animals, beasts of burden. We must bear a load, to justify our miserable existence. We require routine and tradition. That’s order. Order can become excessive, and that’s not good, but chaos can swamp us, so we drown—and that is also not good. We need to stay on the straight and narrow path.
  • I hope that these rules and their accompanying essays will help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life.
  • RULE 1   STAND UP STRAIGHT WITH YOUR SHOULDERS BACK
  • Because territory matters, and because the best locales are always in short supply, territory-seeking among animals produces conflict. Conflict, in turn, produces another problem: how to win or lose without the disagreeing parties incurring too great a cost.
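The problem this passage names, winning or losing contests without paying the full cost of escalation, is the subject of the classic hawk-dove game from evolutionary game theory. The framing and the payoff values below are my illustrative assumptions, not the book's:

```python
# Hawk-dove game: a standard toy model (not from the book) of how contests
# can be settled without every dispute escalating into a full-cost fight.
# V = value of the contested territory, C = cost of an escalated fight.
V, C = 4.0, 6.0

def payoff_hawk(p):
    """Expected payoff to a hawk when a fraction p of the population are hawks."""
    # vs. another hawk: fight, win half the time, pay the cost half the time;
    # vs. a dove: take the resource uncontested.
    return p * (V - C) / 2 + (1 - p) * V

def payoff_dove(p):
    """Expected payoff to a dove against the same population."""
    # vs. a hawk: retreat and get nothing; vs. another dove: share the resource.
    return p * 0 + (1 - p) * V / 2

# When fighting costs more than the prize is worth (C > V), neither pure
# strategy is stable; the mixed equilibrium sits at p = V/C, where the two
# strategies earn identical payoffs.
p_star = V / C
print(payoff_hawk(p_star), payoff_dove(p_star))
```

The equilibrium shows why costly all-out conflict does not take over a population, which is the same selective pressure the passage credits with producing conventions, like dominance hierarchies, for settling disputes cheaply.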
  • It’s winner-take-all in the lobster world, just as it is in human societies, where the top 1 percent have as much loot as the bottom 50 percent—and where the richest eighty-five people have as much as the bottom three and a half billion.
  • This principle is sometimes known as Price’s law, after Derek J. de Solla Price, the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal.
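The concentration Price's law describes (roughly, the square root of the contributors account for half the output) can be sketched numerically. The 1/rank productivity decay below is an illustrative assumption, not data from the book:

```python
import math

def top_share(outputs, k):
    """Fraction of total output produced by the k most productive people."""
    ranked = sorted(outputs, reverse=True)
    return sum(ranked[:k]) / sum(outputs)

# Hypothetical productivity for 100 people, decaying with rank (Zipf-like):
outputs = [1 / rank for rank in range(1, 101)]

n = len(outputs)
k = round(math.sqrt(n))  # Price's law: ~sqrt(n) people produce half the output
share = top_share(outputs, k)
print(f"top {k} of {n} account for {share:.0%} of the output")
# → top 10 of 100 account for 56% of the output
```

Plotting `outputs` against rank gives the approximately L-shaped curve the passage describes: a short steep segment of highly productive people, then a long flat tail.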
  • Instead of undertaking the computationally difficult task of identifying the best man, the females outsource the problem to the machine-like calculations of the dominance hierarchy. They let the males fight it out and peel their paramours from the top.
  • The dominant male, with his upright and confident posture, not only gets the prime real estate and easiest access to the best hunting grounds. He also gets all the girls. It is exponentially more worthwhile to be successful, if you are a lobster, and male.
  • dominance hierarchies have been an essentially permanent feature of the environment to which all complex life has adapted. A third of a billion years ago, brains and nervous systems were comparatively simple. Nonetheless, they already had the structure and neurochemistry necessary to process information about status and society. The importance of this fact can hardly be overstated.
  • evolution works, in large part, through variation and natural selection. Variation exists for many reasons, including gene-shuffling (to put it simply) and random mutation. Individuals vary within a species for such reasons. Nature chooses from among them, across time. That theory, as stated, appears to account for the continual alteration of life-forms over the eons.
  • But there’s an additional question lurking under the surface: what exactly is the “nature” in “natural selection”? What exactly is “the environment” to which animals adapt?
  • Nature “selects.” The idea of selects contains implicitly nested within it the idea of fitness. It is “fitness” that is “selected.” Fitness, roughly speaking, is the probability that a given organism will leave offspring (will propagate its genes through time). The “fit” in “fitness” is therefore the matching of organismal attribute to environmental demand.
  • But nature, the selecting agent, is not a static selector—not in any simple sense.
  • As the environment supporting a species transforms and changes, the features that make a given individual successful in surviving and reproducing also transform and change. Thus, the theory of natural selection does not posit creatures matching themselves ever more precisely to a template specified by the world. It is more that creatures are in a dance with nature, albeit one that is deadly.
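The point that selection tracks a moving target, rather than matching creatures to a fixed template, can be sketched with textbook replicator dynamics. This is my illustration, and the fitness numbers are arbitrary assumptions:

```python
def select(p, w_a, w_b):
    """One generation of selection: returns the new frequency of type A,
    given relative fitnesses w_a (type A) and w_b (type B)."""
    mean_w = p * w_a + (1 - p) * w_b
    return p * w_a / mean_w

# Environment 1 favours type A, so its frequency climbs...
p = 0.5
for _ in range(30):
    p = select(p, w_a=1.1, w_b=1.0)
print(round(p, 3))  # A has nearly taken over

# ...then the environment changes, the same trait becomes a liability,
# and selection drives the once-"fit" type back down.
for _ in range(60):
    p = select(p, w_a=1.0, w_b=1.1)
print(round(p, 3))  # the direction of "fitness" has reversed
```

Which type is "fit" is not a property of the type alone but of the match between type and environment, which is the passage's point about the dance between creatures and a changing nature.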
  • Nature is not simply dynamic, either. Some things change quickly, but they are nested within other things that change less quickly (music…
  • It’s chaos, within order, within chaos, within higher order. The order that is most real is the order that is most unchanging—and that is not necessarily the order that is most easily seen. The leaf, when perceived, might blind the observer to the tree. The tree can blind him to the forest.
  • It is also a mistake to conceptualize nature romantically.
  • Unfortunately, “the environment” is also elephantiasis and guinea worms (don’t ask), anopheles mosquitoes and malaria, starvation-level droughts, AIDS and the Black Plague.
  • It is because of the existence of such things, of course, that we attempt to modify our surroundings, protecting our children, building cities and transportation systems and growing food and generating power.
  • this brings us to a third erroneous concept: that nature is something strictly segregated from the cultural constructs that have emerged within it.
  • It does not matter whether that feature is physical and biological, or social and cultural. All that matters, from a Darwinian perspective, is permanence—and the dominance hierarchy, however social or cultural it might appear, has been around for some half a billion years.
  • The dominance hierarchy is not capitalism. It’s not communism, either, for that matter. It’s not the military-industrial complex. It’s not the patriarchy—that disposable, malleable, arbitrary cultural artefact. It’s not even a human creation; not in the most profound sense. It is instead a near-eternal aspect of the environment, and much of what is blamed on these more ephemeral manifestations is a consequence of its unchanging existence.
  • We were struggling for position before we had skin, or hands, or lungs, or bones. There is little more natural than culture. Dominance hierarchies are older than trees.
  • The part of our brain that keeps track of our position in the dominance hierarchy is therefore exceptionally ancient and fundamental.17 It is a master control system, modulating our perceptions, values, emotions, thoughts and actions. It powerfully affects every aspect of our Being, conscious and unconscious alike.
  • The ancient part of your brain specialized for assessing dominance watches how you are treated by other people. On that evidence, it renders a determination of your value and assigns you a status. If you are judged by your peers as of little worth, the counter restricts serotonin availability. That makes you much more physically and psychologically reactive to any circumstance or event that might produce emotion, particularly if it is negative. You need that reactivity. Emergencies are common at the bottom, and you must be ready to survive. Unfortunately, that physical hyper-response, that constant alertness, burns up a lot of precious energy and physical resources.
  • It will leave you far more likely to live, or die, carelessly, for a rare opportunity at pleasure, when it manifests itself. The physical demands of emergency preparedness will wear you down in every way.21
  • If you have a high status, on the other hand, the counter’s cold, pre-reptilian mechanics assume that your niche is secure, productive…
  • You can delay gratification, without forgoing it forever. You can afford to be a reliable and thoughtful citizen.
  • Sometimes, however, the counter mechanism can go wrong. Erratic habits of sleeping and eating can interfere with its function. Uncertainty can throw it for a loop. The body, with its various parts, needs to function like a well-rehearsed orchestra. Every system must play its role properly, and at exactly the right time, or noise and chaos ensue. It is for this reason that routine is so necessary. The acts of life we repeat every day need to be automatized. They must be turned into stable and reliable habits, so they lose their complexity and gain predictability and simplicity.
  • It is for such reasons that I always ask my clinical clients first about sleep. Do they wake up in the morning at approximately the time the typical person wakes up, and at the same time every day?
  • The next thing I ask about is breakfast. I counsel my clients to eat a fat and protein-heavy breakfast as soon as possible after they awaken (no simple carbohydrates, no sugars…
  • I have had many clients whose anxiety was reduced to subclinical levels merely because they started to sleep on a predictable schedule and eat breakfast.
  • Other bad habits can also interfere with the counter’s accuracy.
  • There are many systems of interaction between brain, body and social world that can get caught in positive feedback loops. Depressed people, for example, can start feeling useless and burdensome, as well as grief-stricken and pained. This makes them withdraw from contact with friends and family. Then the withdrawal makes them more lonesome and isolated, and more likely to feel useless and burdensome. Then they withdraw more. In this manner, depression spirals and amplifies.
  • If someone is badly hurt at some point in life—traumatized—the dominance counter can transform in a manner that makes additional hurt more rather than less likely. This often happens in the case of people, now adults, who were viciously bullied during childhood or adolescence. They become anxious and easily upset. They shield themselves with a defensive crouch, and avoid the direct eye contact interpretable as a dominance challenge.
  • With their capacity for aggression strait-jacketed within a too-narrow morality, those who are only or merely compassionate and self-sacrificing (and naïve and exploitable) cannot call forth the genuinely righteous and appropriately self-protective anger necessary to defend themselves. If you can bite, you generally don’t have to. When skillfully integrated, the ability to respond with aggression and violence decreases rather than increases the probability that actual aggression will become necessary.
  • Naive, harmless people usually guide their perceptions and actions with a few simple axioms: people are basically good; no one really wants to hurt anyone else; the threat (and, certainly, the use) of force, physical or otherwise, is wrong. These axioms collapse, or worse, in the presence of individuals who are genuinely malevolent.27
  • I have had clients who were terrified into literally years of daily hysterical convulsions by the sheer look of malevolence on their attackers’ faces. Such individuals typically come from hyper-sheltered families, where nothing terrible is allowed to exist, and everything is fairyland wonderful (or else).
  • When the wakening occurs—when once-naïve people recognize in themselves the seeds of evil and monstrosity, and see themselves as dangerous (at least potentially)—their fear decreases. They develop more self-respect. Then, perhaps, they begin to resist oppression. They see that they have the ability to withstand, because they are terrible too. They see they can and must stand up, because they begin to understand how genuinely monstrous they will become, otherwise…
  • There is very little difference between the capacity for mayhem and destruction, integrated, and strength of character. This is one of the most difficult lessons of life.
  • even if you came by your poor posture honestly—even if you were unpopular or bullied at home or in grade school—it’s not necessarily appropriate now. Circumstances change. If you slump around, with the same bearing that characterizes a defeated lobster, people will assign you a lower status, and the old counter that you share with crustaceans, sitting at the very base of your brain, will assign you a low dominance number.
  • the other, far more optimistic lesson of Price’s law and the Pareto distribution: those who start to have will probably get more.
  • Some of these upwardly moving loops can occur in your own private, subjective space.
  • If you are asked to move the muscles one by one into a position that looks happy, you will report feeling happier. Emotion is partly bodily expression, and can be amplified (or dampened) by that expression.29
  • To stand up straight with your shoulders back is to accept the terrible responsibility of life, with eyes wide open.
  • It means deciding to voluntarily transform the chaos of potential into the realities of habitable order. It means adopting the burden of self-conscious vulnerability, and accepting the end of the unconscious paradise of childhood, where finitude and mortality are only dimly comprehended. It means willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language).
  • So, attend carefully to your posture. Quit drooping and hunching around. Speak your mind. Put your desires forward, as if you had a right to them—at least the same right as others. Walk tall and gaze forthrightly ahead. Dare to be dangerous. Encourage the serotonin to flow plentifully through the neural pathways desperate for its calming influence.
  • Thus emboldened, you will embark on the voyage of your life, let your light shine, so to speak, on the heavenly hill, and pursue your rightful destiny. Then the meaning of your life may be sufficient to keep the corrupting influence of mortal despair at bay. Then you may be able to accept the terrible burden of the World, and find joy.
  • RULE 2   TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING
  • People are better at filling and properly administering prescription medication to their pets than to themselves.
  • It is difficult to conclude anything from this set of facts except that people appear to love their dogs, cats, ferrets and birds (and maybe even their lizards) more than themselves. How horrible is that? How much shame must exist, for something like that to be true? What could it be about people that makes them prefer their pets to themselves?
  • To understand Genesis 1, the Priestly story, with its insistence on speech as the fundamental creative force, it is first necessary to review a few fundamental, ancient assumptions (these are markedly different in type and intent from the assumptions of science, which are, historically speaking, quite novel).
  • those who existed during the distant time in which the foundational epics of our culture emerged were much more concerned with the actions that dictated survival (and with interpreting the world in a manner commensurate with that goal) than with anything approximating what we now understand as objective truth.
  • Before the dawn of the scientific worldview, reality was construed differently. Being was understood as a place of action, not a place of things.31 It was understood as something more akin to story or drama. That story or drama was lived, subjective experience, as it manifested itself moment to moment in the consciousness of every living person.
  • subjective pain. That’s something so real no argument can stand against it. Everyone acts as if their pain is real—ultimately, finally real. Pain matters, more than matter matters. It is for this reason, I believe, that so many of the world’s traditions regard the suffering attendant upon existence as the irreducible truth of Being.
  • In any case, that which we subjectively experience can be likened much more to a novel or a movie than to a scientific description of physical reality.
  • The Domain, Not of Matter, but of What Matters
  • the world of experience has primal constituents, as well. These are the necessary elements whose interactions define drama and fiction. One of these is chaos. Another is order. The third (as there are three) is the process that mediates between the two, which appears identical to what modern people call consciousness.
  • Chaos is the domain of ignorance itself. It’s unexplored territory. Chaos is what extends, eternally and without limit, beyond the boundaries of all states, all ideas, and all disciplines. It’s the foreigner, the stranger, the member of another gang, the rustle in the bushes in the night-time.
  • It is, in short, all those things and situations we neither know nor understand.
  • Chaos is also the formless potential from which the God of Genesis 1 called forth order using language at the beginning of time. It’s the same potential from which we, made in that Image, call forth the novel and ever-changing moments of our lives. And Chaos is freedom, dreadful freedom, too.
  • Order, by contrast, is explored territory. That’s the hundreds-of-millions-of-years-old hierarchy of place, position and authority. That’s the structure of society. It’s the structure provided by biology, too—particularly insofar as you are adapted, as you are, to the structure of society. Order is tribe, religion, hearth, home and country.
  • Order is the public façade we’re called upon to wear, the politeness of a gathering of civilized strangers, and the thin ice on which we all skate. Order is the place where the behavior of the world matches our expectations and our desires; the place where all things turn out the way we want them to.
  • But order is sometimes tyranny and stultification, as well, when the demand for certainty and uniformity and purity becomes too one-sided.
  • In order, we’re able to think about things in the long term. There, things work, and we’re stable, calm and competent. We seldom leave places we understand—geographical or conceptual—for that reason, and we certainly do not like it when we are compelled to or when it happens accidentally.
  • When the same person betrays you, sells you out, you move from the daytime world of clarity and light to the dark underworld of chaos, confusion and despair. That’s the same move you make, and the same place you visit, when the company you work for starts to fail and your job is placed in doubt.
  • Before the Twin Towers fell—that was order. Chaos manifested itself afterward. Everyone felt it. The very air became uncertain. What exactly was it that fell? Wrong question. What exactly remained standing? That was the issue at hand.
  • Chaos is the deep ocean bottom to which Pinocchio voyaged to rescue his father from Monstro, whale and fire-breathing dragon. That journey into darkness and rescue is the most difficult thing a puppet must do, if he wants to be real; if he wants to extract himself from the temptations of deceit and acting and victimization and impulsive pleasure and totalitarian subjugation; if he wants to take his place as a genuine Being in the world.
  • Chaos is the new place and time that emerges when tragedy strikes suddenly, or malevolence reveals its paralyzing visage, even in the confines of your own home. Something unexpected or undesired can always make its appearance, when a plan is being laid out, regardless of how familiar the circumstances.
  • Our brains respond instantly when chaos appears, with simple, hyper-fast circuits maintained from the ancient days, when our ancestors dwelled in trees, and snakes struck in a flash.32 After that nigh-instantaneous, deeply reflexive bodily response comes the later-evolving, more complex but slower responses of emotions—and, after that, comes thinking, of the higher order, which can extend over seconds, minutes or years. All that response is instinctive, in some sense—but the faster the response, the more instinctive.
  • Things or objects are part of the objective world. They’re inanimate; spiritless. They’re dead. This is not true of chaos and order. Those are perceived, experienced and understood (to the degree that they are understood at all) as personalities—and that is just as true of the perceptions, experiences and understanding of modern people as their ancient forebears. It’s just that moderners don’t notice.
  • Perception of things as entities with personality also occurs before perception of things as things. This is particularly true of the action of others,34 living others, but we also see the non-living “objective world” as animated, with purpose and intent.
  • This is because of the operation of what psychologists have called “the hyperactive agency detector” within us.35 We evolved, over millennia, within intensely social circumstances. This means that the most significant elements of our environment of origin were personalities, not things, objects or situations.
  • The personalities we have evolved to perceive have been around, in predictable form, and in typical, hierarchical configurations, forever, for all intents and purposes. They have been…
  • the category of “parent” and/or “child” has been around for 200 million years. That’s longer than birds have existed. That’s longer than flowers have grown. It’s not a billion years, but it’s still a very long time. It’s plenty long enough for male and female and parent and child to serve as vital and fundamental parts of the environment to which we have adapted. This means that male and female and parent and child are…
  • Our brains are deeply social. Other creatures (particularly, other humans) were crucially important to us as we lived, mated and evolved. Those creatures were…
  • From a Darwinian perspective, nature—reality itself; the environment, itself—is what selects. The environment cannot be defined in any more fundamental manner. It is not mere inert matter. Reality itself is whatever we contend with when we are striving to survive and reproduce. A…
  • as our brain capacity increased and we developed curiosity to spare, we became increasingly aware of and curious about the nature of the world—what we eventually conceptualized as the objective…
  • “outside” is not merely unexplored physical territory. Outside is outside of what we currently understand—and understanding is dealing with and coping with…
  • when we first began to perceive the unknown, chaotic, non-animal world, we used categories that had originally evolved to represent the pre-human animal social world. Our minds are far older than mere…
  • Our most…
  • category—as old, in some sense, as the sexual act itself—appears to be that of sex, male and female. We appear to have taken that primordial knowledge of structured, creative opposition and…
  • Order, the known, appears symbolically associated with masculinity (as illustrated in the aforementioned yang of the Taoist yin-yang symbol). This is perhaps because the primary…
  • Chaos—the unknown—is symbolically associated with the feminine. This is partly because all the things we have come to know were born, originally, of the unknown, just as all beings we encounter were born of mothers. Chaos is mater, origin, source, mother; materia, the substance from which all things are made.
  • In its positive guise, chaos is possibility itself, the source of ideas, the mysterious realm of gestation and birth. As a negative force, it’s the impenetrable darkness of a cave and the accident by the side of the road.
  • Chaos, the eternal feminine, is also the crushing force of sexual selection.
  • Most men do not meet female human standards. It is for this reason that women on dating sites rate 85 percent of men as below average in attractiveness.40
  • Women’s proclivity to say no, more than any other force, has shaped our evolution into the creative, industrious, upright, large-brained (competitive, aggressive, domineering) creatures that we are.42 It is Nature as Woman who says, “Well, bucko, you’re good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation.”
  • Many things begin to fall into place when you begin to consciously understand the world in this manner. It’s as if the knowledge of your body and soul falls into alignment with the knowledge of your intellect.
  • And there’s more: such knowledge is prescriptive, as well as descriptive. This is the kind of knowing what that helps you know how. This is the kind of is from which you can derive an ought. The Taoist juxtaposition of yin and yang, for example, doesn’t simply portray chaos and order as the fundamental elements of Being—it also tells you how to act.
  • The Way, the Taoist path of life, is represented by (or exists on) the border between the twin serpents. The Way is the path of proper Being. It’s the same Way as that referred to by Christ in John 14:6: I am the way, and the truth and the life. The same idea is expressed in Matthew 7:14: Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.
  • We eternally inhabit order, surrounded by chaos. We eternally occupy known territory, surrounded by the unknown. We experience meaningful engagement when we mediate appropriately between them. We are adapted, in the deepest Darwinian sense, not to the world of objects, but to the meta-realities of order and chaos, yang and yin. Chaos and order make up the eternal, transcendent environment of the living.
  • To straddle that fundamental duality is to be balanced: to have one foot firmly planted in order and security, and the other in chaos, possibility, growth and adventure.
  • Chaos and order are fundamental elements because every lived situation (even every conceivable lived situation) is made up of both.
  • you need to place one foot in what you have mastered and understood and the other in what you are currently exploring and mastering. Then you have positioned yourself where the terror of existence is under control and you are secure, but where you are also alert and engaged. That is where there is something new to master and some way that you can be improved. That is where meaning is to be found.
  • The serpent in Eden therefore means the same thing as the black dot in the yin side of the Taoist yin/yang symbol of totality—that is, the possibility of the unknown and revolutionary suddenly manifesting itself where everything appears calm.
  • The outside, chaos, always sneaks into the inside, because nothing can be completely walled off from the rest of reality. So even the ultimate in safe spaces inevitably harbours a snake.
  • We have seen the enemy, after all, and he is us. The snake inhabits each of our souls.
  • The worst of all possible snakes is the eternal human proclivity for evil. The worst of all possible snakes is psychological, spiritual, personal, internal. No walls, however tall, will keep that out. Even if the fortress were thick enough, in principle, to keep everything bad whatsoever outside, it would immediately appear again within.
  • I have learned that these old stories contain nothing superfluous. Anything accidental—anything that does not serve the plot—has long been forgotten in the telling. As the Russian playwright Anton Chekhov advised, “If there is a rifle hanging on the wall in act one, it must be fired in the next act. Otherwise it has no business being there.”50
  • Eve immediately shares the fruit with Adam. That makes him self-conscious. Little has changed. Women have been making men self-conscious since the beginning of time. They do this primarily by rejecting them—but they also do it by shaming them, if men do not take responsibility. Since women bear the primary burden of reproduction, it’s no wonder. It is very hard to see how it could be otherwise. But the capacity of women to shame men and render them self-conscious is still a primal force of nature.
  • What does it mean to know yourself naked?
  • Naked means vulnerable and easily damaged. Naked means subject to judgment for beauty and health. Naked means unprotected and unarmed in the jungle of nature and man. This is why Adam and Eve became ashamed, immediately after their eyes were opened. They could see—and what they first saw was themselves.
  • In their vulnerability, now fully realized, they felt unworthy to stand before God.
  • Beauty shames the ugly. Strength shames the weak. Death shames the living—and the Ideal shames us all.
  • He tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s merely descriptive.
  • women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
  • then God banishes the first man and the first woman from Paradise, out of infancy, out of the unconscious animal world, into the horrors of history itself. And then He puts cherubim and a flaming sword at the gate of Eden, just to stop them from eating the Fruit of the Tree of Life.
  • Perhaps Heaven is something you must build, and immortality something you must earn.
  • so we return to our original query: Why would someone buy prescription medication for his dog, and then so carefully administer it, when he would not do the same for himself?
  • Why should anyone take care of anything as naked, ugly, ashamed, frightened, worthless, cowardly, resentful, defensive and accusatory as a descendant of Adam? Even if that thing, that being, is himself?
  • We know how we are naked, and how that nakedness can be exploited—and that means we know how others are naked, and how they can be exploited. We can terrify other people, consciously. We can hurt and humiliate them for faults we understand only too well. We can torture them—literally—slowly, artfully and terribly. That’s far more than predation. That’s a qualitative shift in understanding. That’s a cataclysm as large as the development of self-consciousness itself. That’s the entry of the knowledge of Good and Evil into the world.
  • Only man could conceive of the rack, the iron maiden and the thumbscrew. Only man will inflict suffering for the sake of suffering. That is the best definition of evil I have been able to formulate.
  • with this realization we have well-nigh full legitimization of the idea, very unpopular in modern intellectual circles, of Original Sin.
  • Human beings have a great capacity for wrongdoing. It’s an attribute that is unique in the world of life. We can and do make things worse, voluntarily, with full knowledge of what we are doing (as well as accidentally, and carelessly, and in a manner that is willfully blind). Given that terrible capacity, that proclivity for malevolent actions, is it any wonder we have a hard time taking care of ourselves, or others—or even that we doubt the value of the entire human enterprise?
  • The juxtaposition of Genesis 1 with Genesis 2 & 3 (the latter two chapters outlining the fall of man, describing why our lot is so tragedy-ridden and ethically torturous) produces a narrative sequence almost unbearable in its profundity. The moral of Genesis 1 is that Being brought into existence through true speech is Good.
  • The original Man and Woman, existing in unbroken unity with their Creator, did not appear conscious (and certainly not self-conscious). Their eyes were not open. But, in their perfection, they were also less, not more, than their post-Fall counterparts. Their goodness was something bestowed, rather than deserved or earned.
  • Maybe, even in some cosmic sense (assuming that consciousness itself is a phenomenon of cosmic significance), free choice matters.
  • here’s a proposition: perhaps it is not simply the emergence of self-consciousness and the rise of our moral knowledge of Death and the Fall that besets us and makes us doubt our own worth. Perhaps it is instead our unwillingness—reflected in Adam’s shamed hiding—to walk with God, despite our fragility and propensity for evil.
  • The entire Bible is structured so that everything after the Fall—the history of Israel, the prophets, the coming of Christ—is presented as a remedy for that Fall, a way out of evil. The beginning of conscious history, the rise of the state and all its pathologies of pride and rigidity, the emergence of great moral figures who try to set things right, culminating in the Messiah Himself—that is all part of humanity’s attempt, God willing, to set itself right. And what would that mean?
  • And this is an amazing thing: the answer is already implicit in Genesis 1: to embody the Image of God—to speak out of chaos the Being that is Good—but to do so consciously, of our own free choice.
  • Back is the way forward—as T. S. Eliot so rightly insisted: “We shall not cease from exploration / And the end of all our exploring / Will be to arrive where we started / And know the place for the first time.”
  • If we wish to take care of ourselves properly, we would have to respect ourselves—but we don’t, because we are—not least in our own eyes—fallen creatures.
  • If we lived in Truth; if we spoke the Truth—then we could walk with God once again, and respect ourselves, and others, and the world. Then we might treat ourselves like people we cared for.
  • We might strive to set the world straight. We might orient it toward Heaven, where we would want people we cared for to dwell, instead of Hell, where our resentment and hatred would eternally sentence everyone.
  • Then, the primary moral issue confronting society was control of violent, impulsive selfishness and the mindless greed and brutality that accompanies it.
  • It is easy to believe that people are arrogant, and egotistical, and always looking out for themselves. The cynicism that makes that opinion a universal truism is widespread and fashionable.
  • But such an orientation to the world is not at all characteristic of many people. They have the opposite problem: they shoulder intolerable burdens of self-disgust, self-contempt, shame and self-consciousness. Thus, instead of narcissistically inflating their own importance, they don’t value themselves at all, and they don’t take care of themselves with attention and skill.
  • Christ’s archetypal death exists as an example of how to accept finitude, betrayal and tyranny heroically—how to walk with God despite the tragedy of self-conscious knowledge—and not as a directive to victimize ourselves in the service of others.
  • To sacrifice ourselves to God (to the highest good, if you like) does not mean to suffer silently and willingly when some person or organization demands more from us, consistently, than is offered in return. That means we are supporting tyranny, and allowing ourselves to be treated like slaves.
  • I learned two very important lessons from Carl Jung, the famous Swiss depth psychologist, about “doing unto others as you would have them do unto you” or “loving your neighbour as yourself.”
  • The first lesson was that neither of these statements has anything to do with being nice. The second was that both are equations, rather than injunctions.
  • If I am someone’s friend, family member, or lover, then I am morally obliged to bargain as hard on my own behalf as they are on theirs.
  • there is little difference between standing up and speaking for yourself, when you are being bullied or otherwise tormented and enslaved, and standing up and speaking for someone else.
  • you do not simply belong to yourself. You are not simply your own possession to torture and mistreat. This is partly because your Being is inexorably tied up with that of others, and your mistreatment of yourself can have catastrophic consequences for others.
  • metaphorically speaking, there is also this: you have a spark of the divine in you, which belongs not to you, but to God. We are, after all—according to Genesis—made in His image.
  • We can make order from chaos—and vice versa—in our way, with our words. So, we may not exactly be God, but we’re not exactly nothing, either.
  • In my own periods of darkness, in the underworld of the soul, I find myself frequently overcome and amazed by the ability of people to befriend each other, to love their intimate partners and parents and children, and to do what they must do to keep the machinery of the world running.
  • It is this sympathy that should be the proper medicament for self-conscious self-contempt, which has its justification, but is only half the full and proper story. Hatred for self and mankind must be balanced with gratefulness for tradition and the state and astonishment at what normal, everyday people accomplish.
  • You have some vital role to play in the unfolding destiny of the world. You are, therefore, morally obliged to take care of yourself.
  • To treat yourself as if you were someone you are responsible for helping is, instead, to consider what would be truly good for you. This is not “what you want.” It is also not “what would make you happy.”
  • You must help a child become a virtuous, responsible, awake being, capable of full reciprocity—able to take care of himself and others, and to thrive while doing so. Why would you think it acceptable to do anything less for yourself?
  • You need to know who you are, so that you understand your armament and bolster yourself in respect to your limitations. You need to know where you are going, so that you can limit the extent of chaos in your life, restructure order, and bring the divine force of Hope to bear on the world.
  • You need to determine how to act toward yourself so that you are most likely to become and to stay a good person.
  • Don’t underestimate the power of vision and direction. These are irresistible forces, able to transform what might appear to be unconquerable obstacles into traversable pathways and expanding opportunities.
  • Once having understood Hell, researched it, so to speak—particularly your own individual Hell—you could decide against going there or creating that.
  • You could, in fact, devote your life to this. That would give you a Meaning, with a capital M. That would justify your miserable existence.
  • That would atone for your sinful nature, and replace your shame and self-consciousness with the natural pride and forthright confidence of someone who has learned once again to walk with God in the Garden.
  • RULE 3   MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU
  • It would be more romantic, I suppose, to suggest that we would have all jumped at the chance for something more productive, bored out of our skulls as we were. But it’s not true. We were all too prematurely cynical and world-weary and leery of responsibility to stick to the debating clubs and Air Cadets and school sports that the adults around us tried to organize. Doing anything wasn’t cool.
  • When you move, everything is up in the air, at least for a while. It’s stressful, but in the chaos there are new possibilities. People, including you, can’t hem you in with their old notions. You get shaken out of your ruts. You can make new, better ruts, with people aiming at better things. I thought this was just a natural development. I thought that every person who moved would have—and want—the same phoenix-like experience.
  • What was it that made Chris and Carl and Ed unable (or, worse, perhaps, unwilling) to move or to change their friendships and improve the circumstances of their lives? Was it inevitable—a consequence of their own limitations, nascent illnesses and traumas of the past?
  • Why did he—like his cousin, like my other friends—continually choose people who, and places that, were not good for him?
  • perhaps, they don’t want the trouble of better. Freud called this a “repetition compulsion.” He thought of it as an unconscious drive to repeat the horrors of the past
  • People create their worlds with the tools they have directly at hand. Faulty tools produce faulty results. Repeated use of the same faulty tools produces the same faulty results.
  • It is in this manner that those who fail to learn from the past doom themselves to repeat it. It’s partly fate. It’s partly inability. It’s partly…unwillingness to learn? Refusal to learn? Motivated refusal to learn?
  • People choose friends who aren’t good for them for other reasons, too. Sometimes it’s because they want to rescue someone.
  • it is not easy to distinguish between someone truly wanting and needing help and someone who is merely exploiting a willing helper. The distinction is difficult even for the person who is wanting and needing and possibly exploiting.
  • When it’s not just naïveté, the attempt to rescue someone is often fuelled by vanity and narcissism.
  • But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you.
  • How do you know that your attempts to pull someone up won’t instead bring them—or you—further down?
  • The same thing happens when well-meaning counsellors place a delinquent teen among comparatively civilized peers. The delinquency spreads, not the stability.65 Down is a lot easier than up.
  • maybe you’re saving someone because you want to convince yourself that the strength of your character is more than just a side effect of your luck and birthplace. Or maybe it’s because it’s easier to look virtuous when standing alongside someone utterly irresponsible.
  • Or maybe you have no plan, genuine or otherwise, to rescue anybody. You’re associating with people who are bad for you not because it’s better for anyone, but because it’s easier.
  • You know it. Your friends know it. You’re all bound by an implicit contract—one aimed at nihilism, and failure, and suffering of the stupidest sort.
  • Before you help someone, you should find out why that person is in trouble. You shouldn’t merely assume that he or she is a noble victim of unjust circumstances and exploitation. It’s the most unlikely explanation, not the most probable.
  • Besides, if you buy the story that everything terrible just happened on its own, with no personal responsibility on the part of the victim, you deny that person all agency in the past (and, by implication, in the present and future, as well).
  • It is far more likely that a given individual has just decided to reject the path upward, because of its difficulty. Perhaps that should even be your default assumption, when faced with such a situation.
  • failure is easy to understand. No explanation for its existence is required. In the same manner, fear, hatred, addiction, promiscuity, betrayal and deception require no explanation. It’s not the existence of vice, or the indulgence in it, that requires explanation. Vice is easy.
  • Failure is easy, too. It’s easier not to shoulder a burden. It’s easier not to think, and not to do, and not to care. It’s easier to put off until tomorrow what needs to be done today.
  • Success: that’s the mystery. Virtue: that’s what’s inexplicable. To fail, you merely have to cultivate a few bad habits. You just have to bide your time. And once someone has spent enough time cultivating bad habits and biding their time, they are much diminished.
  • I am not saying that there is no hope of redemption. But it is much harder to extract someone from a chasm than to lift him from a ditch. And some chasms are very deep. And there’s not much left of the body at the bottom.
  • Carl Rogers, the famous humanistic psychologist, believed it was impossible to start a therapeutic relationship if the person seeking help did not want to improve.67 Rogers believed it was impossible to convince someone to change for the better.
  • none of this is a justification for abandoning those in real need to pursue your narrow, blind ambition, in case it has to be said.
  • Here’s something to consider: If you have a friend whose friendship you wouldn’t recommend to your sister, or your father, or your son, why would you have such a friend for yourself?
  • You are not morally obliged to support someone who is making the world a worse place. Quite the opposite. You should choose people who want things to be better, not worse. It’s a good thing, not a selfish thing, to choose people who are good for you.
  • It is for this reason that every good example is a fateful challenge, and every hero, a judge. Michelangelo’s great perfect marble David cries out to its observer: “You could be more than you are.”
  • Don’t think that it is easier to surround yourself with good healthy people than with bad unhealthy people. It’s not. A good, healthy person is an ideal. It requires strength and daring to stand up near such a person.
  • RULE 4   COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY
  • IT WAS EASIER FOR PEOPLE to be good at something when more of us lived in small, rural communities. Someone could be homecoming queen. Someone else could be spelling-bee champ, math whiz or basketball star. There were only one or two mechanics and a couple of teachers. In each of their domains, these local heroes had the opportunity to enjoy the serotonin-fuelled confidence of the victor.
  • Our hierarchies of accomplishment are now dizzyingly vertical.
  • No matter how good you are at something, or how you rank your accomplishments, there is someone out there who makes you look incompetent.
  • We are not equal in ability or outcome, and never will be. A very small number of people produce very much of everything.
  • People are unhappy at the bottom. They get sick there, and remain unknown and unloved. They waste their lives there. They die there. In consequence, the self-denigrating voice in the minds of people weaves a devastating tale. Life is a zero-sum game. Worthlessness is the default condition.
  • It is for such reasons that a whole generation of social psychologists recommended “positive illusions” as the only reliable route to mental health.69 Their credo? Let a lie be your umbrella. A more dismal, wretched, pessimistic philosophy can hardly be imagined:
  • Here is an alternative approach (and one that requires no illusions). If the cards are always stacked against you, perhaps the game you are playing is somehow rigged (perhaps by you, unbeknownst to yourself). If the internal voice makes you doubt the value of your endeavours—or your life, or life itself—perhaps you should stop listening.
  • There will always be people better than you—that’s a cliché of nihilism, like the phrase, In a million years, who’s going to know the difference? The proper response to that statement is not, Well, then, everything is meaningless. It’s, Any idiot can choose a frame of time within which nothing matters.
  • Standards of better or worse are not illusory or unnecessary. If you hadn’t decided that what you are doing right now was better than the alternatives, you wouldn’t be doing it. The idea of a value-free choice is a contradiction in terms. Value judgments are a precondition for action.
  • Furthermore, every activity, once chosen, comes with its own internal standards of accomplishment. If something can be done at all, it can be done better or worse. To do anything at all is therefore to play a game with a defined and valued end, which can always be reached more or less efficiently and elegantly.
  • We might start by considering the all-too-black-and-white words themselves: “success” or “failure.” You are either a success, a comprehensive, singular, over-all good thing, or its opposite, a failure, a comprehensive, singular, irredeemably bad thing.
  • There are vital degrees and gradations of value obliterated by this binary system, and the consequences are not good.
  • There is not just one game at which to succeed or fail. There are many games and, more specifically, many good games—
  • If changing games does not work, you can invent a new one.
  • You might consider judging your success across all the games you play.
  • When we are very young we are neither individual nor informed. We have not had the time nor gained the wisdom to develop our own standards. In consequence, we must compare ourselves to others, because standards are necessary.
  • As we mature we become, by contrast, increasingly individual and unique. The conditions of our lives become more and more personal and less and less comparable with those of others. Symbolically speaking, this means we must leave the house ruled by our father, and confront the chaos of our individual Being.
  • We must then rediscover the values of our culture—veiled from us by our ignorance, hidden in the dusty treasure-trove of the past—rescue them, and integrate them into our own lives. This is what gives existence its full and necessary meaning.
  • What is it that you actually love? What is it that you genuinely want? Before you can articulate your own standards of value, you must see yourself as a stranger—and then you must get to know yourself.
  • Dare to be truthful. Dare to articulate yourself, and express (or at least become aware of) what would really justify your life.
  • Consult your resentment. It’s a revelatory emotion, for all its pathology. It’s part of an evil triad: arrogance, deceit, and resentment. Nothing causes more harm than this underworld Trinity. But resentment always means one of two things. Either the resentful person is immature, in which case he or she should shut up, quit whining, and get on with it, or there is tyranny afoot—in which case the person subjugated has a moral obligation to speak up.
  • Be cautious when you’re comparing yourself to others. You’re a singular being, once you’re an adult. You have your own particular, specific problems—financial, intimate, psychological, and otherwise.
  • Those are embedded in the unique broader context of your existence. Your career or job works for you in a personal manner, or it does not, and it does so in a unique interplay with the other specifics of your life.
  • We must see, but to see, we must aim, so we are always aiming. Our minds are built on the hunting-and-gathering platforms of our bodies. To hunt is to specify a target, track it, and throw at it.
  • We live within a framework that defines the present as eternally lacking and the future as eternally better. If we did not see things this way, we would not act at all. We wouldn’t even be able to see, because to see we must focus, and to focus we must pick one thing above all else on which to focus.
  • The disadvantage to all this foresight and creativity is chronic unease and discomfort. Because we always contrast what is with what could be, we have to aim at what could be.
  • The present is eternally flawed. But where you start might not be as important as the direction you are heading. Perhaps happiness is always to be found in the journey uphill, and not in the fleeting sense of satisfaction awaiting at the next peak.
  • Called upon properly, the internal critic will suggest something to set in order, which you could set in order, which you would set in order—voluntarily, without resentment, even with pleasure.
  • “Excuse me,” you might say to yourself, without irony or sarcasm. “I’m trying to reduce some of the unnecessary suffering around here. I could use some help.” Keep the derision at bay. “I’m wondering if there is anything that you would be willing to do? I’d be very grateful for your service.” Ask honestly and with humility. That’s no simple matter.
Javier E

Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Stree... - 2 views

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • Most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. Zdeněk Neubauer
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • Adam Smith believed in the power of stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high-GDP growth in low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables,
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: A field that believes in the invisible hand of the market wants to be without mysteries.
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • we seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of. I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that his most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization, or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, is a basic human metacharacteristic, appearing as early as the oldest myths and stories.
  • the Hebrews, with linear time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics: From where did economics get the concept of numbers as the very foundation of the world?
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • The History of Animal Spirits: Dreams Never Sleep
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • The Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • There is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter a mention even of the threat of violence.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—the direct gaining of construction materials in the case of felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • It is a story of nature and civilization; of heroism, defiance, and the battle against the gods and evil; an epic about wisdom, immortality, and also futility.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • But it is in friendship where—often by the way, as a side product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • It started with the Babylonians—rural nature becomes just a supplier of raw materials, resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and which they should reside in, but becomes a mere reservoir for natural (re)sources.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive;
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • At this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things they need should be decreased by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. On the other hand, with people, it appears that the more a person has, the more developed and richer, the greater the number of his needs (including the unsaturated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” On the contrary, evil resides in nature. Humbaba lives in the cedar forest, which also happens to be the reason to completely eradicate it.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, even evil; the good (the humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered as being in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • This time, Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets did.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process we can perceive, with a bit of imagination, as the first seed of the principle of the market’s invisible hand, and therefore a parallel with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage, and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil
  • Enkidu devastated the doings (the external, outside-the-walls) of the city. But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears a thousand years after the reversal, which is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated into a civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual does not try anymore to maximize his goods or profits, but what is important is writing his name in human memory in the form of heroic acts or deeds.
  • immortality, one connected with letters and the cult of the word: A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and at least in the course of the end of his life maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism.
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility maximization role that mainstream economics has tirelessly tried to sew on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization;
  • Today we remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • In the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • Is there something from it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth was born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to awaken, from naïveté to awakening) requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: Camaraderie. For great acts, however, great love is necessary, real love: Friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death,
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crashes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary
  • less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil in his nature, therefore that a person himself is dog eat dog (animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, in their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh). It
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…), they functioned as bankers and participated in emissions of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at
  • There are no echoes of asceticism nor for the cleansing and spiritual effect of poverty. It is fitting therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac and Jacob, were all wealthy men.12
  • about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it automatically. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • however, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Because care for the soul has today been replaced by care for external things,
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: The desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact, that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth at a given place in Mesopotamia20 and at a given time,
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making,
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: Beauty (a perfect face, on which it is “pleasant to look upon,” but also “beauty,” expressed in the Egyptian word nefer, not only means aesthetics, but contains moral qualities as well),
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • the Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • the Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer.35 Job
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • the Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • in an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • RULERS ARE MERE MEN In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule—an
  • The needs of future generations will have to be considered; after all humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together at least partially46 decipherably by any other wise and reasonable being who honors rational rules.
  • which for the methodology of science and economics is very important because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION The created world has an order of sorts, an order recognizable by us as people,
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • I was there when he set heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.
  • the rational examination of nature has its roots, surprisingly, in religion.
  • The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place,
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, final stroke of the brush of creation, naming of the animals—this act is given to a human, it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein tightly names this in his tractatus—the limits of our language are the limits of our world.53
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act (and we
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated, man himself participates in the creation; creation, which is somewhat constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • in this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • discrepancy between savings and investment, and others are convinced of the monetary essence
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • sunspots. The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves. Pharaoh’s Dream: Joseph and the First Business Cycle To
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy Here, let’s make several observations on this: Through taxation74 on the level of one-fifth of a crop75 in good years to save the crop and then open granaries in bad years, the prophecy was de facto prevented (prosperous years were limited and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophesies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
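The granary policy sketched in these highlights amounts to a simple counter-cyclical buffer: levy one-fifth of the crop in fat years, then open the stores in lean ones. A minimal numeric sketch, with purely hypothetical harvest and consumption figures (the biblical text gives none), shows how the levy can exactly cancel the lean-year deficit:

```python
# Hypothetical sketch of the granary policy described in the excerpts above:
# a one-fifth levy in each of seven fat years funds the grain deficit of
# seven lean years. All quantities are illustrative assumptions.

FAT_HARVEST = 1000   # grain gathered in a good year
LEAN_HARVEST = 600   # grain gathered in a bad year
NEED = 800           # grain the population consumes each year
TAX_RATE = 1 / 5     # the one-fifth levy

granary = 0.0
shortfalls = []

for year in range(7):                 # seven fat years: store the levy
    granary += TAX_RATE * FAT_HARVEST

for year in range(7):                 # seven lean years: draw down the stores
    deficit = NEED - LEAN_HARVEST
    draw = min(deficit, granary)
    granary -= draw
    shortfalls.append(deficit - draw)  # unmet need only if the granary runs dry

print(granary)          # grain left after the lean years → 0.0
print(sum(shortfalls))  # total unmet need ("famine") → 0
```

With these numbers the seven levies (7 × 200) exactly match the seven deficits (7 × 200), which is the sense in which the prophecy is self-contradicting: acting on it averts the famine it foretells.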
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • back to Torah: Later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years will simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and examining their influence on the economy,
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we consolidate these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • it is not within the scope of this book to answer that question; justice has been done to the question if it manages to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF? This is probably the most difficult moral problem we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a “moral” act on the basis of economic calculus (therefore we carry out an hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate to each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • the Hebrews offered an interesting compromise between the teachings of the Stoics and Epicureans. We will go into it in detail later, so only briefly
  • constraint. It calls for bounded optimization (with limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and maintaining rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
  • the mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was to defend where the exogenously given rules came from and whether they are universal) and take an indifferent stand to the results of their actions.
  • To Love the Law The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not so much as duty. By no means was this on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when not to (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
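The cost-benefit calculus of law-breaking that the excerpt contrasts with the Hebrew attitude (expected gain versus probability of being caught times the punishment, in the spirit of the modern economics of crime) reduces to a one-line expected-value comparison. The function name and all numbers below are illustrative assumptions:

```python
# Illustrative sketch of the calculus the excerpt rejects: break the law
# only if the expected gain exceeds the expected punishment. Hypothetical
# helper and numbers, for exposition only.

def pays_to_break_law(gain, p_caught, punishment):
    """Return True if the expected value of the offense is positive."""
    expected_cost = p_caught * punishment
    return gain > expected_cost

print(pays_to_break_law(gain=100, p_caught=0.1, punishment=500))  # True  (cost 50)
print(pays_to_break_law(gain=100, p_caught=0.5, punishment=500))  # False (cost 250)
```

The Old Testament stance described above is precisely that the rule is exogenous: no value of `gain` or `p_caught` enters the decision at all.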
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • the principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY Owing to the Hebrew’s liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values,
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material
  • This way of life had understandably immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things was tied to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximalizing utility societywide appear for the first time as embodied in the Talmudic principle of Kofin al midat S´dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be “sold,” but it was not a sale; it was a rental. The price (rent) of real estate depended on how long there was until a forgiveness year. It was about the awareness that we may work the land, but in the last instance we are merely “aliens and strangers,” who have the land only rented to us for a fixed time. All land and riches came from the Lord.
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
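The jubilee pricing rule described in these excerpts (cf. Leviticus 25:15–16) makes every land “sale” a lease whose price is proportional to the number of harvests remaining before the land reverts to its original family. A minimal sketch, with a hypothetical per-harvest value:

```python
# Sketch of the proportional pricing rule described above: since land
# reverts in the jubilee year, its "price" is the value of the harvests
# the buyer will actually reap. The per-harvest value is a hypothetical
# assumption for illustration.

def land_price(annual_yield_value, years_until_jubilee):
    """Lease price of land = value of one harvest x harvests remaining."""
    return annual_yield_value * years_until_jubilee

print(land_price(10, 49))  # bought just after a jubilee → 490
print(land_price(10, 5))   # bought five years before one → 50
```

The closer the forgiveness year, the cheaper the land, which is exactly the sense in which no purchase could permanently concentrate ownership.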
  • Glean: Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net: Every Israelite also had the responsibility of levying a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of putting them into practice. Their fulfillment, where it can be done, takes place gradually, “in layers.” Charitable activities are classified in the Talmud according to several target groups with various priorities, classified, it could be said, according to rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE If it appears to us that today’s era is based on money and debt, and our time will be written into history as the “Debt age,” then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules setting how far one could go in setting guarantees and the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood:
  • In the end, the term “bank” comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a means of exchange, a holder of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt has gone so far and reached such a dominant position in society that operating with debts (fiscal policy) or interest or money supply (monetary policy) means that these can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225–1274), reasoned similarly; in his time, the strict ban on lending at usurious interest was loosened, possibly due to his influence.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • money is something like energy that can travel through time. And it is a very useful energy, but at the same time very dangerous as well. Wherever
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
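The “time travel” described here is, in modern terms, the time value of money: debt pulls purchasing power from the future into the present, while saving pushes it forward. A minimal discounting sketch (the 5% rate and the amounts are arbitrary assumptions, not figures from the source):

```python
def present_value(future_amount: float, rate: float, years: int) -> float:
    """Borrowing 'strips energy from the future': a claim on a future
    amount is worth this much today at the given interest rate."""
    return future_amount / (1 + rate) ** years

def future_value(present_amount: float, rate: float, years: int) -> float:
    """Saving 'accumulates energy from the past' and sends it forward."""
    return present_amount * (1 + rate) ** years

# At 5% interest, 100 due ten years from now is worth about 61.39 today,
# and 100 saved today grows to about 162.89 in ten years.
print(round(present_value(100, 0.05, 10), 2))  # 61.39
print(round(future_value(100, 0.05, 10), 2))   # 162.89
```

This also makes the excerpt's warning concrete: the same compounding that rewards the saver works against the debtor, which is why the text treats interest as “a good servant, but an enslaving master.”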
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that originally belonged to man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.”170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • A good example is the commandment on the Sabbath. No one at all could work on this day, not even those who were subordinate to an observant Jew:
  • the message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • we have days when we must not toil, connected (at least lexically) with the word meaning emptiness: the English term “vacation” (an emptying), as with the French les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength; He rested because He was done with His work, so that He could enjoy it and cherish His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was its lack of an ascetic perception of the world, its respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility of disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time gives our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • we will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. In their own experiences, everyone knows that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8
Javier E

Losing Earth: The Decade We Almost Stopped Climate Change - The New York Times - 0 views

  • As Malcolm Forbes Baldwin, the acting chairman of the president’s Council for Environmental Quality, told industry executives in 1981, “There can be no more important or conservative concern than the protection of the globe itself.”
  • Among those who called for urgent, immediate and far-reaching climate policy were Senators John Chafee, Robert Stafford and David Durenberger; the E.P.A. administrator, William K. Reilly; and, during his campaign for president, George H.W. Bush.
  • It was understood that action would have to come immediately. At the start of the 1980s, scientists within the federal government predicted that conclusive evidence of warming would appear on the global temperature record by the end of the decade, at which point it would be too late to avoid disaster.
  • If the world had adopted the proposal widely endorsed at the end of the ’80s — a freezing of carbon emissions, with a reduction of 20 percent by 2005 — warming could have been held to less than 1.5 degrees.
  • Action had to be taken, and the United States would need to lead. It didn’t.
  • There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
  • The first suggestion to Rafe Pomerance that humankind was destroying the conditions necessary for its own survival came on Page 66 of the government publication EPA-600/7-78-019. It was a technical report about coal
  • ‘This Is the Whole Banana’ Spring 1979
  • here was an urgent problem that demanded their attention, MacDonald believed, because human civilization faced an existential crisis. In “How to Wreck the Environment,” a 1968 essay published while he was a science adviser to Lyndon Johnson, MacDonald predicted a near future in which “nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe.” One of the most potentially devastating such weapons, he believed, was the gas that we exhaled with every breath: carbon dioxide. By vastly increasing carbon emissions, the world’s most advanced militaries could alter weather patterns and wreak famine, drought and economic collapse.
  • the Jasons. They were like one of those teams of superheroes with complementary powers that join forces in times of galactic crisis. They had been brought together by federal agencies, including the C.I.A., to devise scientific solutions to national-security problems: how to detect an incoming missile; how to predict fallout from a nuclear bomb; how to develop unconventional weapons, like plague-infested rats.
  • Agle pointed to an article about a prominent geophysicist named Gordon MacDonald, who was conducting a study on climate change with the Jasons, the mysterious coterie of elite scientists to which he belonged
  • During the spring of 1977 and the summer of 1978, the Jasons met to determine what would happen once the concentration of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. It was an arbitrary milestone, the doubling, but a useful one, as its inevitability was not in question; the threshold would most likely be breached by 2035.
  • The Jasons’ report to the Department of Energy, “The Long-Term Impact of Atmospheric Carbon Dioxide on Climate,” was written in an understated tone that only enhanced its nightmarish findings: Global temperatures would increase by an average of two to three degrees Celsius; Dust Bowl conditions would “threaten large areas of North America, Asia and Africa”; access to drinking water and agricultural production would fall, triggering mass migration on an unprecedented scale. “Perhaps the most ominous feature,” however, was the effect of a changing climate on the poles. Even a minimal warming “could lead to rapid melting” of the West Antarctic ice sheet. The ice sheet contained enough water to raise the level of the oceans 16 feet.
  • MacDonald explained that he first studied the carbon-dioxide issue when he was about Pomerance’s age — in 1961, when he served as an adviser to John F. Kennedy. Pomerance pieced together that MacDonald, in his youth, had been something of a prodigy: In his 20s, he advised Dwight D. Eisenhower on space exploration; at 32, he became a member of the National Academy of Sciences; at 40, he was appointed to the inaugural Council on Environmental Quality, where he advised Richard Nixon on the environmental dangers of burning coal. He monitored the carbon-dioxide problem the whole time, with increasing alarm.
  • They were surprised to learn how few senior officials were familiar with the Jasons’ findings, let alone understood the ramifications of global warming. At last, having worked their way up the federal hierarchy, the two went to see the president’s top scientist, Frank Press.
  • Thus began the Gordon and Rafe carbon-dioxide roadshow. Beginning in the spring of 1979, Pomerance arranged informal briefings with the E.P.A., the National Security Council, The New York Times, the Council on Environmental Quality and the Energy Department, which, Pomerance learned, had established an Office of Carbon Dioxide Effects two years earlier at MacDonald’s urging.
  • Out of respect for MacDonald, Press had summoned to their meeting what seemed to be the entire senior staff of the president’s Office of Science and Technology Policy — the officials consulted on every critical matter of energy and national security. What Pomerance had expected to be yet another casual briefing assumed the character of a high-level national-security meeting.
  • MacDonald would begin his presentation by going back more than a century to John Tyndall — an Irish physicist who was an early champion of Charles Darwin’s work and died after being accidentally poisoned by his wife. In 1859, Tyndall found that carbon dioxide absorbed heat and that variations in the composition of the atmosphere could create changes in climate. These findings inspired Svante Arrhenius, a Swedish chemist and future Nobel laureate, to deduce in 1896 that the combustion of coal and petroleum could raise global temperatures. This warming would become noticeable in a few centuries, Arrhenius calculated, or sooner if consumption of fossil fuels continued to increase.
  • Four decades later, a British steam engineer named Guy Stewart Callendar discovered that, at the weather stations he observed, the previous five years were the hottest in recorded history. Humankind, he wrote in a paper, had become “able to speed up the processes of Nature.” That was in 1939.
  • MacDonald’s history concluded with Roger Revelle, perhaps the most distinguished of the priestly caste of government scientists who, since the Manhattan Project, advised every president on major policy; he had been a close colleague of MacDonald and Press since they served together under Kennedy. In a 1957 paper written with Hans Suess, Revelle concluded that “human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.” Revelle helped the Weather Bureau establish a continuous measurement of atmospheric carbon dioxide at a site perched near the summit of Mauna Loa on the Big Island of Hawaii, 11,500 feet above the sea — a rare pristine natural laboratory on a planet blanketed by fossil-fuel emissions.
  • After nearly a decade of observation, Revelle had shared his concerns with Lyndon Johnson, who included them in a special message to Congress two weeks after his inauguration. Johnson explained that his generation had “altered the composition of the atmosphere on a global scale” through the burning of fossil fuels, and his administration commissioned a study of the subject by his Science Advisory Committee. Revelle was its chairman, and its 1965 executive report on carbon dioxide warned of the rapid melting of Antarctica, rising seas, increased acidity of fresh waters — changes that would require no less than a coordinated global effort to forestall. Yet emissions continued to rise, and at this rate, MacDonald warned, they could see a snowless New England, the swamping of major coastal cities, as much as a 40 percent decline in national wheat production, the forced migration of about one-quarter of the world’s population. Not within centuries — within their own lifetimes.
  • On May 22, Press wrote a letter to the president of the National Academy of Sciences requesting a full assessment of the carbon-dioxide issue. Jule Charney, the father of modern meteorology, would gather the nation’s top oceanographers, atmospheric scientists and climate modelers to judge whether MacDonald’s alarm was justified — whether the world was, in fact, headed to cataclysm.
  • If Charney’s group confirmed that the world was careering toward an existential crisis, the president would be forced to act.
  • Hansen turned from the moon to Venus. Why, he tried to determine, was its surface so hot? In 1967, a Soviet satellite beamed back the answer: The planet’s atmosphere was mainly carbon dioxide. Though once it may have had habitable temperatures, it was believed to have succumbed to a runaway greenhouse effect: As the sun grew brighter, Venus’s ocean began to evaporate, thickening the atmosphere, which forced yet greater evaporation — a self-perpetuating cycle that finally boiled off the ocean entirely and heated the planet’s surface to more than 800 degrees Fahrenheit
  • At the other extreme, Mars’s thin atmosphere had insufficient carbon dioxide to trap much heat at all, leaving it about 900 degrees colder. Earth lay in the middle, its Goldilocks greenhouse effect just strong enough to support life.
  • We want to learn more about Earth’s climate, Jim told Anniek — and how humanity can influence it. He would use giant new supercomputers to map the planet’s atmosphere. They would create Mirror Worlds: parallel realities that mimicked our own. These digital simulacra, technically called “general circulation models,” combined the mathematical formulas that governed the behavior of the sea, land and sky into a single computer model. Unlike the real world, they could be sped forward to reveal the future.
  • The government officials, many of them scientists themselves, tried to suppress their awe of the legends in their presence: Henry Stommel, the world’s leading oceanographer; his protégé, Carl Wunsch, a Jason; the Manhattan Project alumnus Cecil Leith; the Harvard planetary physicist Richard Goody. These were the men who, in the last three decades, had discovered foundational principles underlying the relationships among sun, atmosphere, land and ocean — which is to say, the climate.
  • When, at Charney’s request, Hansen programmed his model to consider a future of doubled carbon dioxide, it predicted a temperature increase of four degrees Celsius. That was twice as much warming as the prediction made by the most prominent climate modeler, Syukuro Manabe, whose government lab at Princeton was the first to model the greenhouse effect. The difference between the two predictions — between warming of two degrees Celsius and four degrees Celsius — was the difference between damaged coral reefs and no reefs whatsoever, between thinning forests and forests enveloped by desert, between catastrophe and chaos.
  • The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
  • within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius
  • The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
  • After the publication of the Charney report, Exxon decided to create its own dedicated carbon-dioxide research program, with an annual budget of $600,000. Only Exxon was asking a slightly different question than Jule Charney. Exxon didn’t concern itself primarily with how much the world would warm. It wanted to know how much of the warming Exxon could be blamed for.
  • “It behooves us to start a very aggressive defensive program,” Shaw wrote in a memo to a manager, “because there is a good probability that legislation affecting our business will be passed.”
  • Shaw turned to Wallace Broecker, a Columbia University oceanographer who was the second author of Roger Revelle’s 1965 carbon-dioxide report for Lyndon Johnson. In 1977, in a presentation at the American Geophysical Union, Broecker predicted that fossil fuels would have to be restricted, whether by taxation or fiat. More recently, he had testified before Congress, calling carbon dioxide “the No.1 long-term environmental problem.” If presidents and senators trusted Broecker to tell them the bad news, he was good enough for Exxon.
  • The company had been studying the carbon-dioxide problem for decades, since before it changed its name to Exxon. In 1957, scientists from Humble Oil published a study tracking “the enormous quantity of carbon dioxide” contributed to the atmosphere since the Industrial Revolution “from the combustion of fossil fuels.” Even then, the observation that burning fossil fuels had increased the concentration of carbon in the atmosphere was well understood and accepted by Humble’s scientists.
  • The American Petroleum Institute, the industry’s largest trade association, asked the same question in 1958 through its air-pollution study group and replicated the findings made by Humble Oil. So did another A.P.I. study conducted by the Stanford Research Institute a decade later, in 1968, which concluded that the burning of fossil fuels would bring “significant temperature changes” by the year 2000 and ultimately “serious worldwide environmental changes,” including the melting of the Antarctic ice cap and rising seas.
  • The ritual repeated itself every few years. Industry scientists, at the behest of their corporate bosses, reviewed the problem and found good reasons for alarm and better excuses to do nothing. Why should they act when almost nobody within the United States government — nor, for that matter, within the environmental movement — seemed worried?
  • Why take on an intractable problem that would not be detected until this generation of employees was safely retired? Worse, the solutions seemed more punitive than the problem itself. Historically, energy use had correlated to economic growth — the more fossil fuels we burned, the better our lives became. Why mess with that?
  • That June, Jimmy Carter signed the Energy Security Act of 1980, which directed the National Academy of Sciences to start a multiyear, comprehensive study, to be called “Changing Climate,” that would analyze social and economic effects of climate change. More urgent, the National Commission on Air Quality, at the request of Congress, invited two dozen experts, including Henry Shaw himself, to a meeting in Florida to propose climate policy.
  • On April 3, 1980, Senator Paul Tsongas, a Massachusetts Democrat, held the first congressional hearing on carbon-dioxide buildup in the atmosphere. Gordon MacDonald testified that the United States should “take the initiative” and develop, through the United Nations, a way to coordinate every nation’s energy policies to address the problem.
  • During the expansion of the Clean Air Act, he pushed for the creation of the National Commission on Air Quality, charged with ensuring that the goals of the act were being met. One such goal was a stable global climate. The Charney report had made clear that goal was not being met, and now the commission wanted to hear proposals for legislation. It was a profound responsibility, and the two dozen experts invited to the Pink Palace — policy gurus, deep thinkers, an industry scientist and an environmental activist — had only three days to achieve it, but the utopian setting made everything seem possible
  • We have less time than we realize, said an M.I.T. nuclear engineer named David Rose, who studied how civilizations responded to large technological crises. “People leave their problems until the 11th hour, the 59th minute,” he said. “And then: ‘Eloi, Eloi, Lama Sabachthani?’ ” — “My God, my God, why hast thou forsaken me?”
  • The attendees seemed to share a sincere interest in finding solutions. They agreed that some kind of international treaty would ultimately be needed to keep atmospheric carbon dioxide at a safe level. But nobody could agree on what that level was.
  • William Elliott, a NOAA scientist, introduced some hard facts: If the United States stopped burning carbon that year, it would delay the arrival of the doubling threshold by only five years. If Western nations somehow managed to stabilize emissions, it would forestall the inevitable by only eight years. The only way to avoid the worst was to stop burning coal. Yet China, the Soviet Union and the United States, by far the world’s three largest coal producers, were frantically accelerating extraction.
  • “Do we have a problem?” asked Anthony Scoville, a congressional science consultant. “We do, but it is not the atmospheric problem. It is the political problem.” He doubted that any scientific report, no matter how ominous its predictions, would persuade politicians to act.
  • The talk of ending oil production stirred for the first time the gentleman from Exxon. “I think there is a transition period,” Henry Shaw said. “We are not going to stop burning fossil fuels and start looking toward solar or nuclear fusion and so on. We are going to have a very orderly transition from fossil fuels to renewable energy sources.”
  • What if the problem was that they were thinking of it as a problem? “What I am saying,” Scoville continued, “is that in a sense we are making a transition not only in energy but the economy as a whole.” Even if the coal and oil industries collapsed, renewable technologies like solar energy would take their place. Jimmy Carter was planning to invest $80 billion in synthetic fuel. “My God,” Scoville said, “with $80 billion, you could have a photovoltaics industry going that would obviate the need for synfuels forever!”
  • Nobody could agree on what to do. John Perry, a meteorologist who had worked as a staff member on the Charney report, suggested that American energy policy merely “take into account” the risks of global warming, though he acknowledged that a nonbinding measure might seem “intolerably stodgy.” “It is so weak,” Pomerance said, the air seeping out of him, “as to not get us anywhere.”
  • Scoville pointed out that the United States was responsible for the largest share of global carbon emissions. But not for long. “If we’re going to exercise leadership,” he said, “the opportunity is now.”
  • One way to lead, he proposed, would be to classify carbon dioxide as a pollutant under the Clean Air Act and regulate it as such. This was received by the room like a belch. By Scoville’s logic, every sigh was an act of pollution. Did the science really support such an extreme measure? The Charney report did exactly that, Pomerance said.
  • Slade, the director of the Energy Department’s carbon-dioxide program, considered the lag a saving grace. If changes did not occur for a decade or more, he said, those in the room couldn’t be blamed for failing to prevent them. So what was the problem?
  • “Call it whatever.” Besides, Pomerance added, they didn’t have to ban coal tomorrow. A pair of modest steps could be taken immediately to show the world that the United States was serious: the implementation of a carbon tax and increased investment in renewable energy. Then the United States could organize an international summit meeting to address climate change.
  • these two dozen experts, who agreed on the major points and had made a commitment to Congress, could not draft a single paragraph. Hours passed in a hell of fruitless negotiation, self-defeating proposals and impulsive speechifying. Pomerance and Scoville pushed to include a statement calling for the United States to “sharply accelerate international dialogue,” but they were sunk by objections and caveats.
  • They never got to policy proposals. They never got to the second paragraph. The final statement was signed by only the moderator, who phrased it more weakly than the declaration calling for the workshop in the first place. “The guide I would suggest,” Jorling wrote, “is whether we know enough not to recommend changes in existing policy.”
  • Pomerance had seen enough. A consensus-based strategy would not work — could not work — without American leadership. And the United States wouldn’t act unless a strong leader persuaded it to do so — someone who would speak with authority about the science, demand action from those in power and risk everything in pursuit of justice.
  • The meeting ended Friday morning. On Tuesday, four days later, Ronald Reagan was elected president.
  • 6. ‘Otherwise, They’ll Gurgle’ November 1980-September 1981
  • In the midst of this carnage, the Council on Environmental Quality submitted a report to the White House warning that fossil fuels could “permanently and disastrously” alter Earth’s atmosphere, leading to “a warming of the Earth, possibly with very serious effects.” Reagan did not act on the council’s advice. Instead, his administration considered eliminating the council.
  • After the election, Reagan considered plans to close the Energy Department, increase coal production on federal land and deregulate surface coal mining. Once in office, he appointed James Watt, the president of a legal firm that fought to open public lands to mining and drilling, to run the Interior Department. “We’re deliriously happy,” the president of the National Coal Association was reported to have said. Reagan preserved the E.P.A. but named as its administrator Anne Gorsuch, an anti-regulation zealot who proceeded to cut the agency’s staff and budget by about a quarter
  • Reagan “has declared open war on solar energy,” the director of the nation’s lead solar-energy research agency said, after he was asked to resign. Reagan appeared determined to reverse the environmental achievements of Jimmy Carter, before undoing those of Richard Nixon, Lyndon Johnson, John F. Kennedy and, if he could get away with it, Theodore Roosevelt.
  • When Reagan considered closing the Council on Environmental Quality, its acting chairman, Malcolm Forbes Baldwin, wrote to the vice president and the White House chief of staff begging them to reconsider; in a major speech the same week, “A Conservative’s Program for the Environment,” Baldwin argued that it was “time for today’s conservatives explicitly to embrace environmentalism.” Environmental protection was not only good sense. It was good business. What could be more conservative than an efficient use of resources that led to fewer federal subsidies?
  • Meanwhile the Charney report continued to vibrate at the periphery of public consciousness. Its conclusions were confirmed by major studies from the Aspen Institute, the International Institute for Applied Systems Analysis near Vienna and the American Association for the Advancement of Science. Every month or so, nationally syndicated articles appeared summoning apocalypse: “Another Warning on ‘Greenhouse Effect,’ ” “Global Warming Trend ‘Beyond Human Experience,’ ” “Warming Trend Could ‘Pit Nation Against Nation.’ ”
  • Pomerance read on the front page of The New York Times on Aug. 22, 1981, about a forthcoming paper in Science by a team of seven NASA scientists. They had found that the world had already warmed in the past century. Temperatures hadn’t increased beyond the range of historical averages, but the scientists predicted that the warming signal would emerge from the noise of routine weather fluctuations much sooner than previously expected. Most unusual of all, the paper ended with a policy recommendation: In the coming decades, the authors wrote, humankind should develop alternative sources of energy and use fossil fuels only “as necessary.” The lead author was James Hansen.
  • Pomerance listened and watched. He understood Hansen’s basic findings well enough: Earth had been warming since 1880, and the warming would reach “almost unprecedented magnitude” in the next century, leading to the familiar suite of terrors, including the flooding of a 10th of New Jersey and a quarter of Louisiana and Florida. But Pomerance was excited to find that Hansen could translate the complexities of atmospheric science into plain English.
  • 7. ‘We’re All Going to Be the Victims’ March 1982
  • Gore had learned about climate change a dozen years earlier as an undergraduate at Harvard, when he took a class taught by Roger Revelle. Humankind was on the brink of radically transforming the global atmosphere, Revelle explained, drawing Keeling’s rising zigzag on the blackboard, and risked bringing about the collapse of civilization. Gore was stunned: Why wasn’t anyone talking about this?
  • Most in Congress considered the science committee a legislative backwater, if they considered it at all; this made Gore’s subcommittee, which had no legislative authority, an afterthought to an afterthought. That, Gore vowed, would change. Environmental and health stories had all the elements of narrative drama: villains, victims and heroes. In a hearing, you could summon all three, with the chairman serving as narrator, chorus and moral authority. He told his staff director that he wanted to hold a hearing every week.
  • The Revelle hearing went as Grumbly had predicted. The urgency of the issue was lost on Gore’s older colleagues, who drifted in and out while the witnesses testified. There were few people left by the time the Brookings Institution economist Lester Lave warned that humankind’s profligate exploitation of fossil fuels posed an existential test to human nature. “Carbon dioxide stands as a symbol now of our willingness to confront the future,” he said. “It will be a sad day when we decide that we just don’t have the time or thoughtfulness to address those issues.”
  • That night, the news programs featured the resolution of the baseball strike, the ongoing budgetary debate and the national surplus of butter.
  • There emerged, despite the general comity, a partisan divide. Unlike the Democrats, the Republicans demanded action. “Today I have a sense of déjà vu,” said Robert Walker, a Republican from Pennsylvania. In each of the last five years, he said, “we have been told and told and told that there is a problem with the increasing carbon dioxide in the atmosphere. We all accept that fact, and we realize that the potential consequences are certainly major in their impact on mankind.” Yet they had failed to propose a single law. “Now is the time,” he said. “The research is clear. It is up to us now to summon the political will.”
  • Hansen flew to Washington to testify on March 25, 1982, performing before a gallery even more thinly populated than at Gore’s first hearing on the greenhouse effect. Gore began by attacking the Reagan administration for cutting funding for carbon-dioxide research despite the “broad consensus in the scientific community that the greenhouse effect is a reality.” William Carney, a Republican from New York, bemoaned the burning of fossil fuels and argued passionately that science should serve as the basis for legislative policy
  • The experts invited by Gore agreed with the Republicans: The science was certain enough. Melvin Calvin, a Berkeley chemist who won the Nobel Prize for his work on the carbon cycle, said that it was useless to wait for stronger evidence of warming. “You cannot do a thing about it when the signals are so big that they come out of the noise,” he said. “You have to look for early warning signs.”
  • Hansen’s job was to share the warning signs, to translate the data into plain English. He explained a few discoveries that his team had made — not with computer models but in libraries. By analyzing records from hundreds of weather stations, he found that the surface temperature of the planet had already increased four-tenths of a degree Celsius in the previous century. Data from several hundred tide-gauge stations showed that the oceans had risen four inches since the 1880s.
  • It occurred to Hansen that this was the only political question that mattered: How long until the worst began? It was not a question on which geophysicists expended much effort; the difference between five years and 50 years in the future was meaningless in geologic time. Politicians were capable of thinking only in terms of electoral time: six years, four years, two years. But when it came to the carbon problem, the two time schemes were converging.
  • “Within 10 or 20 years,” Hansen said, “we will see climate changes which are clearly larger than the natural variability.” James Scheuer wanted to make sure he understood this correctly. No one else had predicted that the signal would emerge that quickly. “If it were one or two degrees per century,” he said, “that would be within the range of human adaptability. But we are pushing beyond the range of human adaptability.” “Yes,” Hansen said.
  • How soon, Scheuer asked, would they have to change the national model of energy production? Hansen hesitated — it wasn’t a scientific question. But he couldn’t help himself. He had been irritated, during the hearing, by all the ludicrous talk about the possibility of growing more trees to offset emissions. False hopes were worse than no hope at all: They undermined the prospect of developing real solutions. “That time is very soon,” Hansen said finally. “My opinion is that it is past,” Calvin said, but he was not heard because he spoke from his seat. He was told to speak into the microphone. “It is already later,” Calvin said, “than you think.”
  • From Gore’s perspective, the hearing was an unequivocal success. That night Dan Rather devoted three minutes of “CBS Evening News” to the greenhouse effect. A correspondent explained that temperatures had increased over the previous century, great sheets of pack ice in Antarctica were rapidly melting, the seas were rising; Calvin said that “the trend is all in the direction of an impending catastrophe”; and Gore mocked Reagan for his shortsightedness. Later, Gore could take credit for protecting the Energy Department’s carbon-dioxide program, which in the end was largely preserved.
  • 8. ‘The Direction of an Impending Catastrophe’ 1982
  • Following Henry Shaw’s recommendation to establish credibility ahead of any future legislative battles, Exxon had begun to spend conspicuously on global-warming research. It donated tens of thousands of dollars to some of the most prominent research efforts, including one at Woods Hole led by the ecologist George Woodwell, who had been calling for major climate policy as early as the mid-1970s, and an international effort coordinated by the United Nations. Now Shaw offered to fund the October 1982 symposium on climate change at Columbia’s Lamont-Doherty campus.
  • David boasted that Exxon would usher in a new global energy system to save the planet from the ravages of climate change. He went so far as to argue that capitalism’s blind faith in the wisdom of the free market was “less than satisfying” when it came to the greenhouse effect. Ethical considerations were necessary, too. He pledged that Exxon would revise its corporate strategy to account for climate change, even if it were not “fashionable” to do so. As Exxon had already made heavy investments in nuclear and solar technology, he was “generally upbeat” that Exxon would “invent” a future of renewable energy.
  • Hansen had reason to feel upbeat himself. If the world’s largest oil-and-gas company supported a new national energy model, the White House would not stand in its way. The Reagan administration was hostile to change from within its ranks. But it couldn’t be hostile to Exxon.
  • The carbon-dioxide issue was beginning to receive major national attention — Hansen’s own findings had become front-page news, after all. What started as a scientific story was turning into a political story.
  • The political realm was itself a kind of Mirror World, a parallel reality that crudely mimicked our own. It shared many of our most fundamental laws, like the laws of gravity and inertia and publicity. And if you applied enough pressure, the Mirror World of politics could be sped forward to reveal a new future. Hansen was beginning to understand that too.
  • 1. ‘Caution, Not Panic’ 1983-1984
  • in the fall of 1983, the climate issue entered an especially long, dark winter. And all because of a single report that had done nothing to change the state of climate science but transformed the state of climate politics.
  • After the publication of the Charney report in 1979, Jimmy Carter had directed the National Academy of Sciences to prepare a comprehensive, $1 million analysis of the carbon-dioxide problem: a Warren Commission for the greenhouse effect. A team of scientist-dignitaries — among them Revelle, the Princeton modeler Syukuro Manabe and the Harvard political economist Thomas Schelling, one of the intellectual architects of Cold War game theory — would review the literature, evaluate the consequences of global warming for the world order and propose remedies
  • Then Reagan won the White House.
  • the incipient report served as the Reagan administration’s answer to every question on the subject. There could be no climate policy, Fred Koomanoff and his associates said, until the academy ruled. In the Mirror World of the Reagan administration, the warming problem hadn’t been abandoned at all. A careful, comprehensive solution was being devised. Everyone just had to wait for the academy’s elders to explain what it was.
  • The committee’s chairman, William Nierenberg — a Jason, presidential adviser and director of Scripps, the nation’s pre-eminent oceanographic institution — argued that action had to be taken immediately, before all the details could be known with certainty, or else it would be too late.
  • Better to bet on American ingenuity to save the day. Major interventions in national energy policy, taken immediately, might end up being more expensive, and less effective, than actions taken decades in the future, after more was understood about the economic and social consequences of a warmer planet. Yes, the climate would change, mostly for the worst, but future generations would be better equipped to change with it.
  • Government officials who knew Nierenberg were not surprised by his conclusions: He was an optimist by training and experience, a devout believer in the doctrine of American exceptionalism, one of the elite class of scientists who had helped the nation win a global war, invent the most deadly weapon conceivable and create the booming aerospace and computer industries. America had solved every existential problem it had confronted over the previous generation; it would not be daunted by an excess of carbon dioxide. Nierenberg had also served on Reagan’s transition team. Nobody believed that he had been directly influenced by his political connections, but his views — optimistic about the saving graces of market forces, pessimistic about the value of government regulation — reflected all the ardor of his party.
  • That’s what Nierenberg wrote in “Changing Climate.” But it’s not what he said in the press interviews that followed. He argued the opposite: There was no urgent need for action. The public should not entertain the most “extreme negative speculations” about climate change (despite the fact that many of those speculations appeared in his report). Though “Changing Climate” urged an accelerated transition to renewable fuels, noting that it would take thousands of years for the atmosphere to recover from the damage of the last century, Nierenberg recommended “caution, not panic.” Better to wait and see
  • The damage of “Changing Climate” was squared by the amount of attention it received. Nierenberg’s speech in the Great Hall, being one-500th the length of the actual assessment, received 500 times the press coverage. As The Wall Street Journal put it, in a line echoed by trade journals across the nation: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: You can cope.”
  • On “CBS Evening News,” Dan Rather said the academy had given “a cold shoulder” to a grim, 200-page E.P.A. assessment published earlier that week (titled “Can We Delay a Greenhouse Warming?”; the E.P.A.’s answer, reduced to a word, was no). The Washington Post described the two reports, taken together, as “clarion calls to inaction.”
  • George Keyworth II, Reagan’s science adviser. Keyworth used Nierenberg’s optimism as reason to discount the E.P.A.’s “unwarranted and unnecessarily alarmist” report and warned against taking any “near-term corrective action” on global warming. Just in case it wasn’t clear, Keyworth added, “there are no actions recommended other than continued research.”
  • Edward David Jr., two years removed from boasting of Exxon’s commitment to transforming global energy policy, told Science that the corporation had reconsidered. “Exxon has reverted to being mainly a supplier of conventional hydrocarbon fuels — petroleum products, natural gas and steam coal,” David said. The American Petroleum Institute canceled its own carbon-dioxide research program, too.
  • Exxon soon revised its position on climate-change research. In a presentation at an industry conference, Henry Shaw cited “Changing Climate” as evidence that “the general consensus is that society has sufficient time to technologically adapt to a CO₂ greenhouse effect.” If the academy had concluded that regulations were not a serious option, why should Exxon protest?
  • 2. ‘You Scientists Win’ 1985
  • 3. The Size of The Human Imagination Spring-Summer 1986
  • Curtis Moore’s proposal: Use ozone to revive climate. The ozone hole had a solution — an international treaty, already in negotiation. Why not hitch the milk wagon to the bullet train? Pomerance was skeptical. The problems were related, sure: Without a reduction in CFC emissions, you didn’t have a chance of averting cataclysmic global warming. But it had been difficult enough to explain the carbon issue to politicians and journalists; why complicate the sales pitch? Then again, he didn’t see what choice he had. The Republicans controlled the Senate, and Moore was his connection to the Senate’s environmental committee.
  • Pomerance met with Senator John Chafee, a Republican from Rhode Island, and helped persuade him to hold a double-barreled hearing on the twin problems of ozone and carbon dioxide on June 10 and 11, 1986.
  • F. Sherwood Rowland, Robert Watson, a NASA scientist, and Richard Benedick, the administration’s lead representative in international ozone negotiations, would discuss ozone; James Hansen, Al Gore, the ecologist George Woodwell and Carl Wunsch, a veteran of the Charney group, would testify about climate change.
  • As Pomerance had hoped, fear about the ozone layer ensured a bounty of press coverage for the climate-change testimony. But as he had feared, it caused many people to conflate the two crises. One was Peter Jennings, who aired the video on ABC’s “World News Tonight,” warning that the ozone hole “could lead to flooding all over the world, also to drought and to famine.”
  • The confusion helped: For the first time since the “Changing Climate” report, global-warming headlines appeared by the dozen. William Nierenberg’s “caution, not panic” line was inverted. It was all panic without a hint of caution: “A Dire Forecast for ‘Greenhouse’ Earth” (the front page of The Washington Post); “Scientists Predict Catastrophes in Growing Global Heat Wave” (Chicago Tribune); “Swifter Warming of Globe Foreseen” (The New York Times).
  • After three years of backsliding and silence, Pomerance was exhilarated to see interest in the issue spike overnight. Not only that: A solution materialized, and a moral argument was passionately articulated — by Rhode Island’s Republican senator no less. “Ozone depletion and the greenhouse effect can no longer be treated solely as important scientific questions,” Chafee said. “They must be seen as critical problems facing the nations of the world, and they are problems that demand solutions.”
  • The old canard about the need for more research was roundly mocked — by Woodwell, by a W.R.I. colleague named Andrew Maguire, by Senator George Mitchell, a Democrat from Maine. “Scientists are never 100 percent certain,” the Princeton historian Theodore Rabb testified. “That notion of total certainty is something too elusive ever to be sought.” As Pomerance had been saying since 1979, it was past time to act. Only now the argument was so broadly accepted that nobody dared object.
  • The ozone hole, Pomerance realized, had moved the public because, though it was no more visible than global warming, people could be made to see it. They could watch it grow on video. Its metaphors were emotionally wrought: Instead of summoning a glass building that sheltered plants from chilly weather (“Everything seems to flourish in there”), the hole evoked a violent rending of the firmament, inviting deathly radiation. Americans felt that their lives were in danger. An abstract, atmospheric problem had been reduced to the size of the human imagination. It had been made just small enough, and just large enough, to break through.
  • Four years after “Changing Climate,” two years after a hole had torn open the firmament and a month after the United States and more than three dozen other nations signed a treaty to limit use of CFCs, the climate-change corps was ready to celebrate. It had become conventional wisdom that climate change would follow ozone’s trajectory. Reagan’s E.P.A. administrator, Lee M. Thomas, said as much the day he signed the Montreal Protocol on Substances That Deplete the Ozone Layer (the successor to the Vienna Convention), telling reporters that global warming was likely to be the subject of a future international agreement.
  • Congress had already begun to consider policy — in 1987 alone, there were eight days of climate hearings, in three committees, across both chambers of Congress; Senator Joe Biden, a Delaware Democrat, had introduced legislation to establish a national climate-change strategy. And so it was that Jim Hansen found himself on Oct. 27 in the not especially distinguished ballroom of the Quality Inn on New Jersey Avenue, a block from the Capitol, at “Preparing for Climate Change,” which was technically a conference but felt more like a wedding.
  • John Topping was an old-line Rockefeller Republican, a Commerce Department lawyer under Nixon and an E.P.A. official under Reagan. He first heard about the climate problem in the halls of the E.P.A. in 1982 and sought out Hansen, who gave him a personal tutorial. Topping was amazed to discover that out of the E.P.A.’s 13,000-person staff, only seven people, by his count, were assigned to work on climate, though he figured it was more important to the long-term security of the nation than every other environmental issue combined.
  • Glancing around the room, Jim Hansen could chart, like an arborist counting rings on a stump, the growth of the climate issue over the decade. Veterans like Gordon MacDonald, George Woodwell and the environmental biologist Stephen Schneider stood at the center of things. Former and current staff members from the congressional science committees (Tom Grumbly, Curtis Moore, Anthony Scoville) made introductions to the congressmen they advised. Hansen’s owlish nemesis Fred Koomanoff was present, as were his counterparts from the Soviet Union and Western Europe. Rafe Pomerance’s cranium could be seen above the crowd, but unusually he was surrounded by colleagues from other environmental organizations that until now had shown little interest in a diffuse problem with no proven fund-raising record. The party’s most conspicuous newcomers, however, the outermost ring, were the oil-and-gas executives.
  • That evening, as a storm spat and coughed outside, Rafe Pomerance gave one of his exhortative speeches urging cooperation among the various factions, and John Chafee and Roger Revelle received awards; introductions were made and business cards earnestly exchanged. Not even a presentation by Hansen of his research could sour the mood. The next night, on Oct. 28, at a high-spirited dinner party in Topping’s townhouse on Capitol Hill, the oil-and-gas men joked with the environmentalists, the trade-group representatives chatted up the regulators and the academics got merrily drunk. Mikhail Budyko, the don of the Soviet climatologists, settled into an extended conversation about global warming with Topping’s 10-year-old son. It all seemed like the start of a grand bargain, a uniting of factions — a solution.
  • Hansen was accustomed to the bureaucratic nuisances that attended testifying before Congress; before a hearing, he had to send his formal statement to NASA headquarters, which forwarded it to the White House’s Office of Management and Budget for approval. “Major greenhouse climate changes are a certainty,” he had written. “By the 2010s [in every scenario], essentially the entire globe has very substantial warming.”
  • By all appearances, plans for major policy continued to advance rapidly. After the Johnston hearing, Timothy Wirth, a freshman Democratic senator from Colorado on the energy committee, began to plan a comprehensive package of climate-change legislation — a New Deal for global warming. Wirth asked a legislative assistant, David Harwood, to consult with experts on the issue, beginning with Rafe Pomerance, in the hope of converting the science of climate change into a new national energy policy.
  • In March 1988, Wirth joined 41 other senators, nearly half of them Republicans, to demand that Reagan call for an international treaty modeled after the ozone agreement. Because the United States and the Soviet Union were the world’s two largest contributors of carbon emissions, responsible for about one-third of the world total, they should lead the negotiations. Reagan agreed. In May, he signed a joint statement with Mikhail Gorbachev that included a pledge to cooperate on global warming.
  • Al Gore himself had, for the moment, withdrawn his political claim to the issue. In 1987, at the age of 39, Gore announced that he was running for president, in part to bring attention to global warming, but he stopped emphasizing it after the subject failed to captivate New Hampshire primary voters.
  • 5. ‘You Will See Things That You Shall Believe’ Summer 1988
  • It was the hottest and driest summer in history. Everywhere you looked, something was bursting into flames. Two million acres in Alaska incinerated, and dozens of major fires scored the West. Yellowstone National Park lost nearly one million acres. Smoke was visible from Chicago, 1,600 miles away.
  • In Nebraska, suffering its worst drought since the Dust Bowl, there were days when every weather station registered temperatures above 100 degrees. The director of the Kansas Department of Health and Environment warned that the drought might be the dawning of a climatic change that within a half century could turn the state into a desert.
  • On June 22 in Washington, where it hit 100 degrees, Rafe Pomerance received a call from Jim Hansen, who was scheduled to testify the following morning at a Senate hearing called by Timothy Wirth. “I hope we have good media coverage tomorrow,” Hansen said.
  • Hansen had just received the most recent global temperature data. Just over halfway into the year, 1988 was setting records. Already it had nearly clinched the hottest year in history. Ahead of schedule, the signal was emerging from the noise. “I’m going to make a pretty strong statement,” Hansen said.
  • Hansen returned to his testimony. He wrote: “The global warming is now large enough that we can ascribe with a high degree of confidence a cause-and-effect relationship to the greenhouse effect.” He wrote: “1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.” He wrote: “The greenhouse effect has been detected, and it is changing our climate now.”
  • “We have only one planet,” Senator Bennett Johnston intoned. “If we screw it up, we have no place to go.” Senator Max Baucus, a Democrat from Montana, called for the United Nations Environment Program to begin preparing a global remedy to the carbon-dioxide problem. Senator Dale Bumpers, a Democrat of Arkansas, previewed Hansen’s testimony, saying that it “ought to be cause for headlines in every newspaper in America tomorrow morning.” The coverage, Bumpers emphasized, was a necessary precursor to policy. “Nobody wants to take on any of the industries that produce the things that we throw up into the atmosphere,” he said. “But what you have are all these competing interests pitted against our very survival.”
  • Hansen, wiping his brow, spoke without affect, his eyes rarely rising from his notes. The warming trend could be detected “with 99 percent confidence,” he said. “It is changing our climate now.” But he saved his strongest comment for after the hearing, when he was encircled in the hallway by reporters. “It is time to stop waffling so much,” he said, “and say that the evidence is pretty strong that the greenhouse effect is here.”
  • The press followed Bumpers’s advice. Hansen’s testimony prompted headlines in dozens of newspapers across the country, including The New York Times, which announced, across the top of its front page: “Global Warming Has Begun, Expert Tells Senate.”
  • Rafe Pomerance called his allies on Capitol Hill, the young staff members who advised politicians, organized hearings, wrote legislation. We need to finalize a number, he told them, a specific target, in order to move the issue — to turn all this publicity into policy. The Montreal Protocol had called for a 50 percent reduction in CFC emissions by 1998. What was the right target for carbon emissions? It wasn’t enough to exhort nations to do better. That kind of talk might sound noble, but it didn’t change investments or laws. They needed a hard goal — something ambitious but reasonable. And they needed it soon: Just four days after Hansen’s star turn, politicians from 46 nations and more than 300 scientists would convene in Toronto at the World Conference on the Changing Atmosphere, an event described by Philip Shabecoff of The New York Times as “Woodstock for climate change.”
  • Pomerance had a proposal: a 20 percent reduction in carbon emissions by 2000. Ambitious, Harwood said. In all his work planning climate policy, he had seen no assurance that such a steep drop in emissions was possible. Then again, 2000 was more than a decade off, so it allowed for some flexibility.
  • Mintzer pointed out that a 20 percent reduction was consistent with the academic literature on energy efficiency. Various studies over the years had shown that you could improve efficiency in most energy systems by roughly 20 percent if you adopted best practices.
  • Of course, with any target, you had to take into account the fact that the developing world would inevitably consume much larger quantities of fossil fuels by 2000. But those gains could be offset by a wider propagation of the renewable technologies already at hand — solar, wind, geothermal. It was not a rigorous scientific analysis, Mintzer granted, but 20 percent sounded plausible. We wouldn’t need to solve cold fusion or ask Congress to repeal the law of gravity. We could manage it with the knowledge and technology we already had.
  • Besides, Pomerance said, 20 by 2000 sounds good.
  • The conference’s final statement, signed by all 400 scientists and politicians in attendance, repeated the demand with a slight variation: a 20 percent reduction in carbon emissions by 2005. Just like that, Pomerance’s best guess became global diplomatic policy.
  • Hansen, emerging from Anniek’s successful cancer surgery, took it upon himself to start a one-man public information campaign. He gave news conferences and was quoted in seemingly every article about the issue; he even appeared on television with homemade props. Like an entrant at an elementary-school science fair, he made “loaded dice” out of sections of cardboard and colored paper to illustrate the increased likelihood of hotter weather in a warmer climate. Public awareness of the greenhouse effect reached a new high of 68 percent
  • global warming became a major subject of the presidential campaign. While Michael Dukakis proposed tax incentives to encourage domestic oil production and boasted that coal could satisfy the nation’s energy needs for the next three centuries, George Bush took advantage. “I am an environmentalist,” he declared on the shore of Lake Erie, the first stop on a five-state environmental tour that would take him to Boston Harbor, Dukakis’s home turf. “Those who think we are powerless to do anything about the greenhouse effect,” he said, “are forgetting about the White House effect.”
  • His running mate emphasized the ticket’s commitment to the issue at the vice-presidential debate. “The greenhouse effect is an important environmental issue,” Dan Quayle said. “We need to get on with it. And in a George Bush administration, you can bet that we will.”
  • This kind of talk roused the oil-and-gas men. “A lot of people on the Hill see the greenhouse effect as the issue of the 1990s,” a gas lobbyist told Oil & Gas Journal. Before a meeting of oil executives shortly after the “environmentalist” candidate won the election, Representative Dick Cheney, a Wyoming Republican, warned, “It’s going to be very difficult to fend off some kind of gasoline tax.” The coal industry, which had the most to lose from restrictions on carbon emissions, had moved beyond denial to resignation. A spokesman for the National Coal Association acknowledged that the greenhouse effect was no longer “an emerging issue. It is here already, and we’ll be hearing more and more about it.”
  • By the end of the year, 32 climate bills had been introduced in Congress, led by Wirth’s omnibus National Energy Policy Act of 1988. Co-sponsored by 13 Democrats and five Republicans, it established as a national goal an “International Global Agreement on the Atmosphere by 1992,” ordered the Energy Department to submit to Congress a plan to reduce energy use by at least 2 percent a year through 2005 and directed the Congressional Budget Office to calculate the feasibility of a carbon tax. A lawyer for the Senate energy committee told an industry journal that lawmakers were “frightened” by the issue and predicted that Congress would eventually pass significant legislation after Bush took office
  • The other great powers refused to wait. The German Parliament created a special commission on climate change, which concluded that action had to be taken immediately, “irrespective of any need for further research,” and that the Toronto goal was inadequate; it recommended a 30 percent reduction of carbon emissions
  • Margaret Thatcher, who had studied chemistry at Oxford, warned in a speech to the Royal Society that global warming could “greatly exceed the capacity of our natural habitat to cope” and that “the health of the economy and the health of our environment are totally dependent upon each other.”
  • The prime ministers of Canada and Norway called for a binding international treaty on the atmosphere; Sweden’s Parliament went further, announcing a national strategy to stabilize emissions at the 1988 level and eventually imposing a carbon tax
  • the United Nations unanimously endorsed the establishment, by the World Meteorological Organization and the United Nations Environment Program, of an Intergovernmental Panel on Climate Change, composed of scientists and policymakers, to conduct scientific assessments and develop global climate policy.
  • One of the I.P.C.C.’s first sessions to plan an international treaty was hosted by the State Department, 10 days after Bush’s inauguration. James Baker chose the occasion to make his first speech as secretary of state. “We can probably not afford to wait until all of the uncertainties about global climate change have been resolved,” he said. “Time will not make the problem go away.”
  • On April 14, 1989, a bipartisan group of 24 senators, led by the majority leader, George Mitchell, requested that Bush cut emissions in the United States even before the I.P.C.C.’s working group made its recommendation. “We cannot afford the long lead times associated with a comprehensive global agreement,” the senators wrote. Bush had promised to combat the greenhouse effect with the White House effect. The self-proclaimed environmentalist was now seated in the Oval Office. It was time.
  • 8. ‘You Never Beat The White House’ April 1989
  • After Jim Baker gave his boisterous address to the I.P.C.C. working group at the State Department, he received a visit from John Sununu, Bush’s chief of staff. Leave the science to the scientists, Sununu told Baker. Stay clear of this greenhouse-effect nonsense. You don’t know what you’re talking about. Baker, who had served as Reagan’s chief of staff, didn’t speak about the subject again.
  • despite his reputation as a political wolf, he still thought of himself as a scientist — an “old engineer,” as he was fond of putting it, having earned a Ph.D. in mechanical engineering from M.I.T. decades earlier. He lacked the reflexive deference that so many of his political generation reserved for the class of elite government scientists.
  • Since World War II, he believed, conspiratorial forces had used the imprimatur of scientific knowledge to advance an “anti-growth” doctrine. He reserved particular disdain for Paul Ehrlich’s “The Population Bomb,” which prophesied that hundreds of millions of people would starve to death if the world took no step to curb population growth; the Club of Rome, an organization of European scientists, heads of state and economists, which similarly warned that the world would run out of natural resources; and as recently as the mid-’70s, the hypothesis advanced by some of the nation’s most celebrated scientists — including Carl Sagan, Stephen Schneider and Ichtiaque Rasool — that a new ice age was dawning, thanks to the proliferation of man-made aerosols. All were theories of questionable scientific merit, portending vast, authoritarian remedies to halt economic progress.
  • Sununu had suspected that the greenhouse effect belonged to this nefarious cabal since 1975, when the anthropologist Margaret Mead convened a symposium on the subject at the National Institute of Environmental Health Sciences.
  • When Mead talked about “far-reaching” decisions and “long-term consequences,” Sununu heard the marching of jackboots.
  • While Sununu and Darman reviewed Hansen’s statements, the E.P.A. administrator, William K. Reilly, took a new proposal to the White House. The next meeting of the I.P.C.C.’s working group was scheduled for Geneva the following month, in May; it was the perfect occasion, Reilly argued, to take a stronger stand on climate change. Bush should demand a global treaty to reduce carbon emissions.
  • Sununu wouldn’t budge. He ordered the American delegates not to make any commitment in Geneva. Very soon after that, someone leaked the exchange to the press.
  • A deputy of Jim Baker pulled Reilly aside. He said he had a message from Baker, who had observed Reilly’s infighting with Sununu. “In the long run,” the deputy warned Reilly, “you never beat the White House.”
  • 9. ‘A Form of Science Fraud’ May 1989
  • The cameras followed Hansen and Gore into the marbled hallway. Hansen insisted that he wanted to focus on the science. Gore focused on the politics. “I think they’re scared of the truth,” he said. “They’re scared that Hansen and the other scientists are right and that some dramatic policy changes are going to be needed, and they don’t want to face up to it.”
  • The censorship did more to publicize Hansen’s testimony and the dangers of global warming than anything he could have possibly said. At the White House briefing later that morning, Press Secretary Marlin Fitzwater admitted that Hansen’s statement had been changed. He blamed an official “five levels down from the top” and promised that there would be no retaliation. Hansen, he added, was “an outstanding and distinguished scientist” and was “doing a great job.”
  • 10. The White House Effect Fall 1989
  • The Los Angeles Times called the censorship “an outrageous assault.” The Chicago Tribune said it was the beginning of “a cold war on global warming,” and The New York Times warned that the White House’s “heavy-handed intervention sends the signal that Washington wants to go slow on addressing the greenhouse problem.”
  • Darman went to see Sununu. He didn’t like being accused of censoring scientists. They needed to issue some kind of response. Sununu called Reilly to ask if he had any ideas. We could start, Reilly said, by recommitting to a global climate treaty. The United States was the only Western nation on record as opposing negotiations.
  • Sununu sent a telegram to Geneva endorsing a plan “to develop full international consensus on necessary steps to prepare for a formal treaty-negotiating process. The scope and importance of this issue are so great that it is essential for the U.S. to exercise leadership.”
  • Sununu seethed at any mention of the subject. He had taken it upon himself to study the greenhouse effect more deeply; he had a rudimentary, one-dimensional general circulation model installed on his personal desktop computer. He decided that the models promoted by Jim Hansen were a lot of bunk. They were horribly imprecise in scale and underestimated the ocean’s ability to mitigate warming. Sununu complained about Hansen to D. Allan Bromley, a nuclear physicist from Yale who, at Sununu’s recommendation, was named Bush’s science adviser. Hansen’s findings were “technical poppycock” that didn’t begin to justify such wild-eyed pronouncements as “the greenhouse effect is here” or that the 1988 heat waves could be attributed to global warming, let alone serve as the basis for national economic policy.
  • When a junior staff member in the Energy Department, in a meeting at the White House with Sununu and Reilly, mentioned an initiative to reduce fossil-fuel use, Sununu interrupted her. “Why in the world would you need to reduce fossil-fuel use?” he asked. “Because of climate change,” the young woman replied. “I don’t want anyone in this administration without a scientific background using ‘climate change’ or ‘global warming’ ever again,” he said. “If you don’t have a technical basis for policy, don’t run around making decisions on the basis of newspaper headlines.” After the meeting, Reilly caught up to the staff member in the hallway. She was shaken. Don’t take it personally, Reilly told her. Sununu might have been looking at you, but that was directed at me.
  • Reilly, for his part, didn’t entirely blame Sununu for Bush’s indecision on the prospect of a climate treaty. The president had never taken a vigorous interest in global warming and was mainly briefed about it by nonscientists. Bush had brought up the subject on the campaign trail, in his speech about the White House effect, after leafing through a briefing booklet for a new issue that might generate some positive press. When Reilly tried in person to persuade him to take action, Bush deferred to Sununu and Baker. Why don’t the three of you work it out, he said. Let me know when you decide
  • Relations between Sununu and Reilly became openly adversarial. Reilly, Sununu thought, was a creature of the environmental lobby. He was trying to impress his friends at the E.P.A. without having a basic grasp of the science himself.
  • Pomerance had the sinking feeling that the momentum of the previous year was beginning to flag. The censoring of Hansen’s testimony and the inexplicably strident opposition from John Sununu were ominous signs. So were the findings of a report Pomerance had commissioned, published in September by the World Resources Institute, tracking global greenhouse-gas emissions. The United States was the largest contributor by far, producing nearly a quarter of the world’s carbon emissions, and its contribution was growing faster than that of every other country. Bush’s indecision, or perhaps inattention, had already managed to delay the negotiation of a global climate treaty until 1990 at the earliest, perhaps even 1991. By then, Pomerance worried, it would be too late.
  • Pomerance tried to be more diplomatic. “The president made a commitment to the American people to deal with global warming,” he told The Washington Post, “and he hasn’t followed it up.” He didn’t want to sound defeated. “There are some good building blocks here,” Pomerance said, and he meant it. The Montreal Protocol on CFCs wasn’t perfect at first, either — it had huge loopholes and weak restrictions. Once in place, however, the restrictions could be tightened. Perhaps the same could happen with climate change. Perhaps. Pomerance was not one for pessimism. As William Reilly told reporters, dutifully defending the official position forced upon him, it was the first time that the United States had formally endorsed the concept of an emissions limit. Pomerance wanted to believe that this was progress.
  • All week in Noordwijk, Becker couldn’t stop talking about what he had seen in Zeeland. After a flood in 1953, when the sea swallowed much of the region, killing more than 2,000 people, the Dutch began to build the Delta Works, a vast concrete-and-steel fortress of movable barriers, dams and sluice gates — a masterpiece of human engineering. The whole system could be locked into place within 90 minutes, defending the land against storm surge. It reduced the country’s exposure to the sea by 700 kilometers, Becker explained. The United States coastline was about 153,000 kilometers long. How long, he asked, was the entire terrestrial coastline? Because the whole world was going to need this. In Zeeland, he said, he had seen the future.
  • Ken Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, Calif., has a habit of asking new graduate students to name the largest fundamental breakthrough in climate physics since 1979. It’s a trick question. There has been no breakthrough. As with any mature scientific discipline, there is only refinement. The computer models grow more precise; the regional analyses sharpen; estimates solidify into observational data. Where there have been inaccuracies, they have tended to be in the direction of understatement.
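Caldeira’s point that the basic physics was settled decades ago can be made concrete: the core greenhouse mechanism fits in a few lines. Below is a minimal zero-dimensional energy-balance sketch (my illustration, with assumed parameter values; not a model from the article), in which a single-layer atmosphere of emissivity `eps` re-radiates part of the planet’s heat back to the surface:

```python
# Illustrative zero-dimensional energy-balance model (an assumption of this
# note, not a model from the article): the planet absorbs sunlight and
# radiates heat; a single-layer atmosphere with emissivity eps re-radiates
# part of that heat downward, raising the equilibrium surface temperature.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0    # solar constant, W m^-2
ALBEDO = 0.30     # fraction of sunlight reflected back to space

def surface_temp(eps: float) -> float:
    """Equilibrium surface temperature (K) for atmospheric emissivity eps."""
    absorbed = SOLAR * (1.0 - ALBEDO) / 4.0   # averaged over the sphere
    return (absorbed / (SIGMA * (1.0 - eps / 2.0))) ** 0.25

if __name__ == "__main__":
    print(f"no atmosphere:   {surface_temp(0.0):.1f} K")   # ~254.6 K
    print(f"with greenhouse: {surface_temp(0.78):.1f} K")  # ~288.1 K
```

With `eps = 0` the model recovers the airless-planet value near 255 K; with `eps` around 0.78 it lands close to Earth’s observed mean of roughly 288 K. The refinement Caldeira describes has been about resolution, regional detail and feedbacks, not this basic result.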
  • More carbon has been released into the atmosphere since the final day of the Noordwijk conference, Nov. 7, 1989, than in the entire history of civilization preceding it
  • Despite every action taken since the Charney report — the billions of dollars invested in research, the nonbinding treaties, the investments in renewable energy — the only number that counts, the total quantity of global greenhouse gas emitted per year, has continued its inexorable rise.
  • When it comes to our own nation, which has failed to make any binding commitments whatsoever, the dominant narrative for the last quarter century has concerned the efforts of the fossil-fuel industries to suppress science, confuse public knowledge and bribe politicians.
  • The mustache-twirling depravity of these campaigns has left the impression that the oil-and-gas industry always operated thus; while the Exxon scientists and American Petroleum Institute clerics of the ’70s and ’80s were hardly good Samaritans, they did not start multimillion-dollar disinformation campaigns, pay scientists to distort the truth or try to brainwash children in elementary schools, as their successors would.
  • It was James Hansen’s testimony before Congress in 1988 that, for the first time since the “Changing Climate” report, made oil-and-gas executives begin to consider the issue’s potential to hurt their profits. Exxon, as ever, led the field. Six weeks after Hansen’s testimony, Exxon’s manager of science and strategy development, Duane LeVine, prepared an internal strategy paper urging the company to “emphasize the uncertainty in scientific conclusions.” This shortly became the default position of the entire sector. LeVine, it so happened, served as chairman of the global petroleum industry’s Working Group on Global Climate Change, created the same year, which adopted Exxon’s position as its own
  • The American Petroleum Institute, after holding a series of internal briefings on the subject in the fall and winter of 1988, including one for the chief executives of the dozen or so largest oil companies, took a similar, if slightly more diplomatic, line. It set aside money for carbon-dioxide policy — about $100,000, a fraction of the millions it was spending on the health effects of benzene, but enough to establish a lobbying organization called, in an admirable flourish of newspeak, the Global Climate Coalition.
  • The G.C.C. was conceived as a reactive body, to share news of any proposed regulations, but on a whim, it added a press campaign, to be coordinated mainly by the A.P.I. It gave briefings to politicians known to be friendly to the industry and approached scientists who professed skepticism about global warming. The A.P.I.’s payment for an original op-ed was $2,000.
  • It was joined by the U.S. Chamber of Commerce and 14 other trade associations, including those representing the coal, electric-grid and automobile industries
  • In October 1989, scientists allied with the G.C.C. began to be quoted in national publications, giving an issue that lacked controversy a convenient fulcrum. “Many respected scientists say the available evidence doesn’t warrant the doomsday warnings,” was the caveat that began to appear in articles on climate change.
  • The following year, when President Bill Clinton proposed an energy tax in the hope of meeting the goals of the Rio treaty, the A.P.I. invested $1.8 million in a G.C.C. disinformation campaign. Senate Democrats from oil-and-coal states joined Republicans to defeat the tax proposal, which later contributed to the Republicans’ rout of Democrats in the midterm congressional elections in 1994 — the first time the Republican Party had won control of both houses in 40 years
  • The G.C.C. spent $13 million on a single ad campaign intended to weaken support for the 1997 Kyoto Protocol, which committed its parties to reducing greenhouse-gas emissions by 5 percent relative to 1990 levels. The Senate, which would have had to ratify the agreement, took a pre-emptive vote declaring its opposition; the resolution passed 95-0. There has never been another serious effort to negotiate a binding global climate treaty.
  • This has made the corporation an especially vulnerable target for the wave of compensatory litigation that began in earnest in the last three years and may last a generation. Tort lawsuits have become possible only in recent years, as scientists have begun more precisely to attribute regional effects to global emission levels. This is one subfield of climate science that has advanced significantly.
  • Pomerance had not been among the 400 delegates invited to Noordwijk. But together with three young activists — Daniel Becker of the Sierra Club, Alden Meyer of the Union of Concerned Scientists and Stewart Boyle from Friends of the Earth — he had formed his own impromptu delegation. Their constituency, they liked to say, was the climate itself. Their mission was to pressure the delegates to include in the final conference statement, which would be used as the basis for a global treaty, the target proposed in Toronto: a 20 percent reduction of greenhouse-gas emissions by 2005. It was the only measure that mattered, the amount of emissions reductions, and the Toronto number was the strongest global target yet proposed.
  • The delegations would review the progress made by the I.P.C.C. and decide whether to endorse a framework for a global treaty. There was a general sense among the delegates that they would, at minimum, agree to the target proposed by the host, the Dutch environmental minister, more modest than the Toronto number: a freezing of greenhouse-gas emissions at 1990 levels by 2000. Some believed that if the meeting was a success, it would encourage the I.P.C.C. to accelerate its negotiations and reach a decision about a treaty sooner. But at the very least, the world’s environmental ministers should sign a statement endorsing a hard, binding target of emissions reductions. The mood among the delegates was electric, nearly giddy — after more than a decade of fruitless international meetings, they could finally sign an agreement that meant something.
  • 11. ‘The Skunks at The Garden Party’ November 1989
  • It was nearly freezing — Nov. 6, 1989, on the coast of the North Sea in the Dutch resort town of Noordwijk
  • Losing Earth: The Decade We Almost Stopped Climate Change. We knew everything we needed to know, and nothing stood in our way. Nothing, that is, except ourselves. A tragedy in two acts. By Nathaniel Rich. Photographs and Videos by George Steinmetz. Aug. 1, 2018
Javier E

Opinion | Are We on the Cusp of a New Political Order? - The New York Times - 1 views

  • Gary Gerstle: A political order is a way of thinking differently about political time in America. We focus so much on two-, four- and six-year election cycles. A political order is something that lasts beyond particular elections, that refers to the ability of one political party to arrange a constellation of policies, constituencies, think tanks, candidates and individuals who come to dominate politics for extended periods of time. And their dominance becomes so strong that the opposition party, if it still wants to remain a real player in American politics, feels compelled to acquiesce and to come aboard the other political party’s platform.
  • They usually last 30 or 40 years. Economic crisis is usually involved in the emergence of a new order and the breakup of the old. Every political order also has not only an ideology but a vision of a good life in America.
  • What constitutes a good life? Because that becomes really important in terms of selling the virtues of that political order to a mass base, which is something that has to be won and sustained in American politics in order for a political order to exist and thrive.
  • ...153 more annotations...
  • It was a revolutionary power that wanted to end capitalism everywhere, not just in the Soviet Union but all over Asia and Africa, North America, South America. They were gaining a lot of support in the decolonizing societies of Africa and Asia. America was not confident in the ability of its economy to have a permanent recovery from the Great Depression.
  • When I teach young people today, it’s hard for them to grasp the magnitude and the seriousness of the Cold War and how it shaped every aspect of American life. And the Soviet Union represented an existential threat to the United States.
  • What coheres to the New Deal is that the Republicans eventually submit to it. And that happens when Gen. Dwight D. Eisenhower beats Senator Robert A. Taft. So tell me a bit about the counterfactual there that you think almost happened. What led to Taft losing prominence in the Republican Party, and what might have happened if he hadn’t?
  • he was slow to get on the bandwagon in terms of the threat of China, the threat of Communist expansion, and that opened up an opportunity for another candidate, by the name of Dwight D. Eisenhower, to enter the presidential race in 1952 and to present a very different vision.
  • He was a Republican in a classical sense — small central government, devolved power to the states, suspicious of foreign entanglements — believing that America was protected by the two vast oceans and thus did not need a strong standing army, did not have to be involved in world affairs. And he was opposed to the New Deal.
  • He thought it was a form of tyranny. It was going to lead to collectivism, Soviet style. And he was poised in the 1940s to roll back the New Deal, and he was looking forward to the postwar period after the war emergency had passed. Of course, the war emergency would require a very strong state to mobilize armed forces, to mobilize the economy for the sake of fighting a world war.
  • They needed foreign markets. America wasn’t sure whether it would have them. And the capitalist class in America was scared to death by the Communist threat, and it had to be met everywhere, and America mobilizes for the Cold War to contain Communism everywhere where it appeared. And that required a standing army in quasi-peacetime of a sort that America had never experienced before, and Taft was profoundly uncomfortable with this.
  • my counterfactual is that, absent the Cold War, the New Deal, which we now regard as such a juggernaut, would be seen as a momentary blip like so many other progressive moments in American politics. And we would see it as a blip and not for what it became, which was a political order that dominated politics for 30 years.
  • So there’s been this conventional story of the New Deal era, which is that the fear of Communism, the fear of being painted as soft on Communism or soft on socialism, leads progressives to trim their sails, moderates the sort of left flank of New Dealism. You argue that that story misses what’s happening on the right.
  • the imperative of fighting the Communists caused Republicans to make even larger concessions than the Democrats did.” What were those concessions?
  • Well, the biggest concession was agreeing to an extraordinary system of progressive taxation.
  • The highest marginal tax rate in the 1940s during World War II reached 91 percent, a level that is inconceivable in America of the 21st century. Eisenhower wins the election in 1952. He has both houses of Congress. And quite extraordinarily, Eisenhower maintains the 91 percent taxation rate
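The 91 percent figure is a marginal rate, not an average: it applied only to income above the top threshold. A quick sketch of the bracket arithmetic, using a hypothetical two-bracket schedule (my assumption for illustration, not the historical tax tables):

```python
# Illustrative marginal-rate arithmetic with a hypothetical two-bracket
# schedule (not the actual 1950s tax tables): the 91 percent rate taxes
# only income above the top threshold, so the effective rate is lower.

# (upper bound of bracket, rate applied within it)
BRACKETS = [(200_000, 0.40), (float("inf"), 0.91)]  # hypothetical values

def tax_owed(income: float) -> float:
    """Total tax under the bracket schedule above."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

income = 500_000
owed = tax_owed(income)
print(f"tax: ${owed:,.0f}, effective rate: {owed / income:.0%}")
```

Even with a 91 percent top bracket, the effective rate on this hypothetical income works out to about 71 percent; the top rate never applies to the whole income.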
  • I think what mattered to him was the Cold War. The Cold War had to be fought on two fronts: It had to be fought militarily — international containment of Communism — and that required enormous expenditures on national defense, which meant not simply a conventional army but the nuclear arms race.
  • Eisenhower understood that in order to win the ideological struggle of the Cold War — which was not simply an American-Soviet struggle, but it was a global struggle to convince all the peoples of what was then called the Third World to come with the capitalist way, to come with the American way. In order for that to happen, America had to demonstrate that it could give its ordinary citizens a good life.
  • America had to prove that it had the better system, and that meant you could not return to unrestrained American capitalism — you had to regulate it in the public interest.
  • And the other aspect of that, which he appreciated, was that in the 1950s, it was not clear whether the Soviet Union or the United States could provide a better life for its average citizen. The Soviet Union was still doing quite well in the 1950s.
  • And that meant taking money from the rich and redistributing it, narrowing the inequality between rich and poor. It meant supporting powerful labor movement and not trying to roll back the Wagner Act, which the labor movement regarded as its Magna Carta, a very strong piece of federal legislation that gave it unambiguous rights to organize and obligated employers to bargain collectively with them.
  • He felt that this had to be the way that America went. Maintenance of Social Security — really all the key New Deal reforms — he ended up maintaining because he thought this would be a critically important instrument for convincing not just ordinary Americans but people around the world that this would prove the superiority of the American way.
  • That is why he acquiesced to the New Deal order.
  • It’s a pervasive recognition among America’s business class. You say, “The fear of Communism made possible the class compromise between capital and labor that underwrote the New Deal order.”
  • And you say it wasn’t just here; this was also true in many of the social democracies in Europe after World War II. Tell me a bit about that class compromise and the role the Cold War played in it.
  • It is often said that socialism was weaker in America than it was elsewhere. And in many respects, that has been true.
  • The corollary of that is that the American business class historically has been bigger, more powerful, more unencumbered than the business classes of other nations, especially in Western Europe among America’s industrial rivals. There was no shortage of labor protest in America, but rarely could labor achieve what it wanted to achieve because the resistance was extraordinary, the resistance was legal, it was extralegal.
  • The national security argument is crucial to getting large segments of the Republican Party on board. For them, the greatest threat, both internationally and domestically, was the Communist threat. And thus, they were willing to extend themselves beyond a point where they otherwise would have gone
  • I argue that it was the fear of the Soviet Union. And what did the fear of the Soviet Union represent? The expropriation of all corporate capital in the world. That was the Communist dream. And that was deeply felt. And it was felt not simply in a global setting. It was felt within the United States itself.
  • The history of industrial relations in America was very violent. The business class in America had a reputation of being very powerful and aggressive and unwilling to share its power with its antagonists. So what was it that got them to share that power?
  • it’s really remarkable to look at how closely the R. and D. state was designed and sold in terms of its ability to keep America ahead for national defense. It has its roots in World War II, and it continues building on much of that rhetoric.
  • so there’s this interesting way, I think we think of the New Deal in terms of Social Security. We think of it in terms of some of these individual programs. But it is this thoroughgoing expansion of the government into all kinds of areas of American life. And the thing that allows the Republican Party to get on board with a lot of that is this idea that if you don’t do that, well, the Soviets are going to do it
  • And the business class felt that it was in its interests to compromise with organized labor in a way that it had never done before. That was the grand compromise. It was symbolized in a treaty in Detroit between the three automobile makers, then among the biggest corporations in America, and the United Auto Workers — the Treaty of Detroit — purchasing labor peace by granting unions, good wages, good conditions, good pensions, good health care. Absent the threat of Communism, I think that grand compromise either would not have been arrived at or it would have been scuttled much sooner than it was.
  • they’re going to have the highways, or they’re going to have the technological or scientific superiority, they’re going to make it to the moon, etc., and then America is going to be left behind.
  • The vast education bills that are going to propel the tremendous growth of American universities in the 1960s and 1970s — which you mentioned about R. and D. — has a similar propulsion
  • the scale of this would not have reached the point that it did without getting a lot of Republicans on board. And the critical argument for them was national security, and a critical event was Sputnik, when the Soviet Union shocks the United States by putting a satellite into orbit before the United States had done it.
  • that is a shocking moment: Oh, my God, America is falling behind. We must bend every muscle to beating the Soviet Union in every way, and that requires tremendous investments because of satellite technology and R. and D., and also that becomes the foundation of what is going to become the I.T. industry and the I.T. revolution — also a product of the Cold War.
  • How does that order end?
  • There are three factors that pull this order apart. The first is race, the second is Vietnam, and the third is the major economic recession of the 1970s.
  • Every political order has tensions within it in the United States. And the great contradiction in the New Deal Party of Franklin Roosevelt was the treatment of African Americans. In order to have a new political economy of a big state managing private capital in the public interest, Roosevelt had to get the South on board, and the South meant the white South.
  • And the entire promise of Western European prosperity and American prosperity had been premised on the flow of unending supplies of very cheap Middle Eastern oil — most of it controlled by U.S. and British oil companies. And Saudi Arabia and other oil-producing nations in the 1970s say: No, these are our resources. We will determine how much is drawn out of the ground and the prices that will be charged.
  • That was then complicated by Vietnam, a vastly unpopular war — inaugurated and presided over by Democratic presidents who were perceived by their own constituents to not be telling the truth about this awful quagmire.
  • It also inaugurated trade-offs between funding a war and funding Johnson’s beloved Great Society. Inflation began to take off.
  • the third element was profound changes in the international political economy. One of the reasons why America was able to enter its grand compromise between capital and labor and pay labor very high wages was that America had no serious industrial competition in the world from the ’40s to the ’60s.
  • Most of the industrialized world had been destroyed. The U.S. is actively helping the recovery of Western European economies, Japan, promoting development in Southeast Asia, and in the 1970s, these economies begin to challenge American supremacy economically. The symbol of that is the rise of Japanese car manufacturers.
  • Roosevelt assented to that. But this was also a time, especially in the 1940s, when African Americans were migrating in huge numbers to the North, and they were becoming a constituency in the Democratic Party. This was the first point of crisis, and the Democratic Party found itself unable to contain the racial conflicts that exploded in the 1960s.
  • The quadrupling of oil prices, along with competition from European nations against the United States, leads to a profound economic crisis. And this plunges the United States into a very unexpected and profound — and long — economic crisis known as stagflation. Inflation and unemployment are going up at the same time.
  • None of the textbooks say this should be happening. The tools are no longer working. And it’s in this moment of crisis that the Democratic Party — this is the third strike against it — opens up an opportunity for alternative politics, an alternative party, an alternative plan for American political economy.
  • that sort of leaves out something that is happening among Democrats at this time. There’s a movement inside of liberalism. There’s the New Deal Democratic order, but you develop this New Left, and there is a movement of liberals against big government — young liberals for reasons of self-expression, for reasons of civil rights, for reasons of this feeling that they’re being fed into a bureaucracy and giant soulless organizations and eventually into the meat grinder of Vietnam.
  • older liberals who are angry about the sort of reckless growth and the poisoning of streams and the building of highways through their communities and the sort of ticky-tacky rise of these suburbs. And this predates Reagan.
  • Yes, the New Left erupts on university campuses in the 1960s, and the two primary issues in the beginning are race and Vietnam. But they also quite quickly develop a critique of the established order.
  • What was called, at the time, “the system.”
  • what was the system? The system was large American corporations that were no longer under control. And one reason they were no longer under control is they were being aided and abetted by a large federal state that was supposed to manage them in the public interest.
  • the system was meant to identify not just the corporations who were doing ill in America, but it was meant to identify a federal state that was birthed in the optimism of the New Deal and had been corrupted. So you have this fissure within the Democratic Party itself.
  • The other element of this is this profound search for personal freedom and autonomy that was intensely felt by members of the New Left.
  • The computers were these enormous machines, mainframes, and they were seen as stultifying to human creativity. The personal computer movement was born as part of the New Left. Steve Jobs, Stewart Brand imagined a personal computer that would be free of the IBM mainframe, free of big corporations, big corporate power — that it would be the authentic voice of every individual who would be using that machine.
  • It was a profound expression of a desire for personal autonomy, individuality, expressiveness — unconstrained by larger structures. This cry, or cri de coeur, came from the left. It was a very powerful part of the New Left.
  • One can see how it might suit the purposes of a rising neoliberal order, because the rising neoliberal order was also intent on deregulating, freeing individuals from the grip of large institutions and allowing them to go their own way.
  • Neoliberals believe that the best economic program is one that frees capitalism from its shackles, that allows people to truck, barter and exchange goods, that gets the government out of economic life. And the only role for government is to ensure that markets can function freely and robustly. So it runs opposite to the New Deal. If the core principle of the New Deal was: Capitalism left to its own devices would destroy itself. The core principle of neoliberalism: Remove the shackles from capitalism. That will bring us the most productive and freest world we can imagine.
  • I have a shorthand for describing the neoliberal world that was envisioned by neoliberal thinkers and brought into existence by policymakers. It’s what I sometimes call the four freedoms of neoliberalism: the freedom of people to move; the freedom of goods to move across national boundaries; the free flow of information; and the free flow of capital across all boundaries.
  • In a perfect neoliberal world, people, goods, information and capital are moving freely without constraint. If we can imagine a perfect world that The Wall Street Journal wants, this would be pretty close to it.
  • I do not want to suggest for a moment that the New Left intentionally created neoliberalism. But it turned out that the cries of freedom, personal freedom, personal autonomy that were emanating from them turned out to be very conducive to the economic philosophy of neoliberalism.
  • Jimmy Carter is an heir to suspicion of excessive federal power. But I also think he’s grasping at this moment a point of transition in the American economy and a sense that government policy as set forth in the New Deal was not working as well as it should have been. I think it mattered that he was an engineer and he was doing a lot of cost-benefit analysis: What kind of yield are we getting for the bucks that we’re investing?
  • so he’s open to this fertile moment of dissent. He’s channeling new thinkers and imagining a different Democratic Party that you are correct in saying precedes Clinton by 20 years. And the key figure in this movement is a man by the name of Ralph Nader.
  • I think as I evaluate the Carter presidency, I see a man really caught in the throes of a moment of transition, able to glimpse what is coming but unable to master what is coming.
  • what defines his presidency, for me, is uncertainty, vacillation and, thus, failure. He’s a classical transitional figure, more controlled by than in charge of the moment.
  • Nader is a man of the left, but he doesn’t fit in the old left or the New Left.
  • We might call him a man of the consumer left. For him, the key figure in American society was the consumer, and he wanted to champion the consumer. And his contributions — in terms of automobile safety, occupational safety, food safety — were immense.
  • But he also executed a profound shift in ideology, and I’m not even sure how aware he was of the consequences of what he was generating. Because in the process of making the consumer sovereign, he deflected attention, I would say, from what was and what remains the core relationship in a capitalist economy, and that is in the realm of production and the relations between employers and employees.
  • And he was reluctant, in some respects, to challenge corporate power if corporate power was serving the consumer in a good way. He anticipates, in some respects, a profound shift in antitrust policy, and the key figure in this is going to be Robert Bork in the 1980s and 1990s.
  • It had been an article of faith in American history that no corporation should be allowed to get too large, because they would inevitably exercise power in an undemocratic fashion. So antitrust meant breaking up big corporations. Under Robert Bork, the question changed. Big corporate power was OK as long as it served the consumer with cheap goods.
  • he and his supporters and his organizations deserve a lot of credit for holding the government accountable and making vast improvements in a whole host of areas — regulating the environment and other matters, regulating food — and compelling government to do the service that it does.
  • But it also distracts from understanding part of that which powers the rise of large corporations and gives them the ability to control government and capture regulatory agencies. And I think the results of his attacks on government have been ambivalent, in terms of their consequences: in some respects really accelerating the process of delivering goods to the American people and American consumers that they want but, on the other hand, contributing to an atmosphere of thinking the government can’t really do much that’s right.
  • As you move toward Reagan, certainly part of Ronald Reagan’s appeal is his anti-Communism. So how do you describe the role of the Soviet Union in this period of political time?
  • The collapse of the Soviet Union between 1989 and 1991 is one of the most stunning events, I think, of the 20th century and arguably much longer.
  • What were its consequences? First, it opened up the whole globe to capitalist penetration, to a degree that had not been available to capitalism since prior to World War I. And this generates a tremendous amount of belief and excitement and expansion and a good deal of arrogance and hubris within the capitalist citadel, which is the United States. So that’s one major consequence.
  • The second major consequence is: What does it mean for Communism no longer to exist as a threat? And what we begin to see in the 1990s is capital in America regaining the power, assurance, authority and belief in its unilateral power that, across the years of the Cold War, it had, if not sacrificed, then moderated.
  • What the Soviet Union had promised, what Communism had promised, was that private enterprise could be superseded by rational planning on the part of an enlightened set of rulers who could manage the economy in a way that benefited the masses in extraordinary ways.
  • That whole project fails, and it fails in a spectacular fashion.
  • Ronald Reagan had insisted that there was a continuum between Soviet government tyranny and what he regarded as New Deal government tyranny. They were on the same spectrum. One inevitably led to another. He and other Republicans, George H.W. Bush, the party as a whole take this as a great vindication of their core beliefs: that capitalism, which, under the New Deal, was sharply constrained, should be freed from constraint; its animal spirits allowed to soar; venture capitalists encouraged to go everywhere; investments made easy; lower taxation; let capitalists and capital drive America and the world economy, unconstrained by regulation.
  • these were the core ideas of neoliberals, which have been incubating for decades. And now suddenly these ideas seem to be vindicated. This is the moment of free market triumph.
  • it intersects in a very powerful way with the ongoing I.T. revolution, which is also bound up with the Soviet Union’s collapse, because the Soviet Union was very hostile to the personal computer, which required a degree of personal freedom that the Soviet Union wasn’t willing to allow. What the I.T. revolution represented in the 1990s — and this is one of the reasons that Democrats get on board with it — was a belief that market perfection was now within human grasp: that there may have been a need for strong government in the past, because knowledge about markets was imperfect, it was limited, it took time for information about markets to travel, a lot of it was wrong, not enough of it was available instantaneously.
  • Well, suddenly in the 1990s, you have this dream, this vision of all economic knowledge in the world being available at your fingertips instantaneously and with a degree of depth and a range of statistics and figures that had been unimaginable, and a techno-utopianism takes hold.
  • it’s the intersection of these two vectors — a sense that the collapse of the Soviet Union vindicates free market thinking and the I.T. revolution — that allows people to think market perfection is within our grasp in ways it never has been before, that pours fuel on the fire of neoliberal free market thinking.
  • You described Bill Clinton as the Dwight D. Eisenhower of neoliberalism. What do you mean by that, and what are some of the, for you, core examples?
  • When Bill Clinton was elected in 1992, no Democratic U.S. president had been elected since 1976. Sixteen years is an eternity in electoral politics in the United States. And the question becomes: Will he roll back the Reagan revolution of the 1980s — massive efforts at deregulation — or will he follow a path that Dwight Eisenhower followed in the early ’50s?
  • Clinton, in the beginning, is a little uncertain about what he is going to do. And he has some ambitious proposals in his first two years — most notably a vast program of national health insurance, which crashes spectacularly.
  • And then he gets punished for that venture severely in the 1994 congressional elections, which bring Newt Gingrich and a very right-wing group of Republicans to power — the first time that Republicans control both houses of Congress since 1952. It’s a huge achievement for the Republicans.
  • Clinton reads that moment as signifying that the older Democratic Party of the New Deal, of Franklin Roosevelt and Lyndon Johnson, really had to be reworked and revamped.
  • the only way for him to win re-election, and the only way for the Democrats to hold on to national power and to regain it in Congress in 1996, is for him to acquiesce to some core Reaganite beliefs. And at the center of the Reaganite project was deregulation — which is a code word for getting the government out of economic affairs or curtailing government power.
  • Archived clip of President Bill Clinton: We know big government does not have all the answers. We know there’s not a program for every problem. We know and we have worked to give the American people a smaller, less bureaucratic government in Washington. And we have to give the American people one that lives within its means. The era of big government is over.
  • so Clinton signs off on the Telecommunications Act of 1996, which effectively deregulates the burgeoning I.T. sector of the economy, makes possible an unregulated internet. He signs off on the repeal of the Glass-Steagall Act in 1999.
  • The Glass-Steagall Act had divided investment from commercial banking and had imposed a kind of regulation on Wall Street that brought an end to the crazy speculation that had brought about the Great Depression in the first place. It was a core principle of the New Deal.
  • He does not seek to revive the Fairness Doctrine, in terms of regulating public media, which had guided successive Democratic administrations: the idea that if a news outlet put out one side of a debate on a policy matter, they were obligated to give the other side equal access.
  • He becomes an advocate of deregulation and, in some respects, pushes deregulation further than Reagan himself had been able to do. And in that sense, he acquiesces to some of the core principles of the Reagan revolution rather than seeking to roll them back, and it is in that respect that I think it’s appropriate to think of him as a Democratic Eisenhower.
  • what one remembers most about those battles is how much Clinton and Newt Gingrich hated each other’s guts. And they were seen as being polar opposites.
  • Clinton, the representative of a New Left America: cosmopolitan, open to the liberation movements, looking for new ways of creating a new and diverse America, embracing sexual liberations — his embrace of gay rights was somewhat limited but still significant. Newt Gingrich, on the other hand, representing traditional Victorian America, wanting to reassert the patriarchal, heterosexual family, men at work, women in the home, religious.
  • one of the surprises, to me, in working on this book, because I remember those days very well, was the degree to which they worked together — on telecommunication, on reform of Wall Street, on welfare.
  • Clinton would claim, and his defenders would claim, that he was triangulating. He was trying to make the best of a bad deal, that popular opinion was running with free markets, was running with the Republicans. And to some extent, that was true.
  • the lesson that I draw from that moment is that one must refrain from always getting sucked into the daily battles over cultural issues.
  • “cosmopolitanism.” Something that was fresh, to me, in your book was this argument that in neoliberalism, you’re looking at more than just what we typically think of it as, which is an economic theory. You argue that there is a moral ethic that came alongside it, that is part of it. You talk about it as, at various times, cosmopolitan, individualistic. Tell me about it.
  • “Neoliberalism” is often defined, as you say, simply as being about markets and freeing them up
  • And “neoliberalism” is also defined as something that’s profoundly elitist in orientation, and it’s a device and an ideology used by elites to implant market ideology on a society in ways that deepens economic inequality and has the ability to strangle the democratic rights of the masses.
  • I also say that in America, it had a profound popular base. Reagan was an enormously successful president, and by “success,” I mean he was able to excite the imagination of majorities of American voters, and his core message was freedom.
  • half the time he meant freedom in terms of a free enterprise economy, but the other half of the time he meant freedom in terms of giving individuals the autonomy to go their own way.
  • he was not a fan of the liberation movements of the ’60s. But when Clinton becomes president in the 1990s, he has a profound connection to those liberation movements of the 1960s — to feminism, to sexual liberation, to civil rights.
  • He delights in a world in which everyone can travel to wherever they want to go. He valorizes immigrants. He valorizes diversity. These are all values that are profoundly compatible with the neoliberal vision: the opportunity to travel anywhere, to seek out personal adventure, to seek out different cultures.
  • This is a world that neoliberalism makes possible, and it’s a thrilling moment for many people who have the opportunity either to mix in the world of American cities, which have filled up with immigrants, or to travel abroad and experience other cultures.
  • A single global marketplace enables and encourages the kind of cosmopolitanism that people on the left-center side of the political spectrum in America have so deeply valued.
  • you locate the end of this era in the financial crisis of 2008 and 2009. Why?
  • The promise of neoliberalism was that it would lift all boats. There was an acknowledgment about those who were freeing the energies of the market economy that it would probably increase inequality, the distance between the rich and the poor, but that the increase in inequality wouldn’t matter because the forces of production that would be unleashed on a global scale would be so powerful and so profound that everybody would have more and everybody would have a better life.
  • And what the 2008-9 financial crisis exposed was, first, that a lot of the market freedom that neoliberalism had unleashed had led to corrupt banking and financial practices that had brought the world to the edge of a financial abyss of unimaginable proportions. We ended up skirting that abyss — but not by a lot.
  • on the other hand, it brought into view a sense of how profoundly unequal the access to power was under the neoliberal regime. And here it’s not so much the financial crash itself but the nature of what governments did to promote recovery from the financial crash.
  • The object in the U.S. and also in Europe became to save the banks first. The culprits of this financial crisis were the ones who were bailed out first. If you were an American in 2009, 2010, 2011, who had assets in the stock market, you had pretty much recovered your position by 2011, 2012. If you were not one of those fortunate Americans and you were living week to week on a paycheck, your recovery did not occur.
  • You didn’t reach pre-2008 levels until 2016, 2017, 2018, and people understood, profoundly, the inequality of recovery, and it caused them to look with a much more scrutinizing gaze at the inequalities that neoliberalism had generated and how those inequalities have become so embedded in government policy toward the rich and the poor.
  • one of the identity crises in the Republican Party — one reason the Republican Party is not held together better — is that the Soviet Union was fundamental to what made its various factions stay in place. And it was also, I think, fundamental to what kept the Republican Party, which at its core has a real anti-government streak, committed in any way to real government.
  • Then I think there’s a sort of casting about for another enemy. I think they end up finding it after 9/11, or think they have, in what they try to turn into global jihadism, and then it falls apart — both as the antagonist and as a project — and it just feels to me like another part of the sort of wreckage of this period that opens a way for something new.
  • That new thing, I think, is more Donald Trump than it is anything else.
  • I think it discredits what had been a core project of the Republican Party, which was to spread market freedom everywhere. When I teach the Iraq war, I tell my 20-year-old students that this is the worst foreign policy mistake in U.S. history, that it’s going to take the U.S. and the world 50 years to recover from. And it’s imbued with a neoliberal hubris that everyone in the world is simply waiting for the wonders of a market economy to unleash, to be unleashed upon them.
  • OK, if that era ended, what is being born?
  • there’s also new zones of agreement that have emerged. When I think about the way I covered politics in 2010, the legitimacy of elections could be taken for granted, and the legitimacy of the Affordable Care Act could not.
  • I think it’s useful in this moment of acute polarization to look at some of what lies beneath the polarization.
  • you’re right: On a series of issues there are intriguing conversations going on between Democrats and Republicans. China and tariffs are one area of agreement.
  • Ironically, immigration is becoming another area of agreement, regardless of who wins the election. One can imagine that the bill agreed to in the Senate late in 2023 could easily be implemented in some form.
  • There is an area of convergence on antitrust. Josh Hawley and Lina Khan seem to like each other and are finding some common ground on that. And the national security hawks in the G.O.P., people like Marco Rubio and Mitch McConnell, have converged with what we might call the industrial policy doves in the Democratic Party — people like Bernie Sanders — on the importance of reshoring critical sectors of manufacturing and on improving in dramatic ways the nation’s infrastructure.
  • we can see here a new political economy taking shape, one that breaks with the central principle of neoliberalism, which is that markets must lead and the only role for a state is to facilitate markets.
  • another element of that, which has been crucial to the ideological reorientation, is a new understanding of the relationship of free markets to democracy.
  • for the longest period of time, Americans and Europeans were willing to give China a blank check on their democracy, or on their violations of democracy, because of the belief that if market freedom and capitalist practices set down deep enough roots in China that people with economic freedom would want to add to that political freedom and that democracy would begin to flourish and that the Communist Party that rules China would either have to profoundly reform itself or see itself ushered from the political stage.
  • It’s hard to convince people now of how deeply rooted that belief was. No one in the Democratic or Republican Parties believes that anymore, and that has intensified the fear along with this “Oh, my God” sense that China is not simply producing ordinary goods. It’s producing very sophisticated goods. It’s cornering markets on electrical vehicles and batteries and solar panels that seemed unimaginable 15 or 20 years ago. And it has had the effect of profoundly shocking both parties.
  • that has completely transformed, and the word “protectionism” is not being used because it’s such a negative term. But the sentiments that lie behind protectionism, which might be described more positively as fair trade, are profoundly with us and shape the conversation about U.S. economic relations with China every day of the week.
  • So the change has been profound in both parties, and one of the surprises of the Biden administration, although in retrospect, it’s not so surprising, given the Biden administration’s commitment to industrial policy, is the continuity we see between Trump tariffs and Biden tariffs.
  • They’ve also come, in many cases, to the view that we should have much more industrial policy: the sense that if you leave it to the market, China might, by using the government to foster and supercharge certain kinds of market pursuits in China, just lap us. I think it’s become the dominant view in both parties.
  • I would agree with that, although I think the Republican Party is probably more deeply split on this than the Democratic Party is. The Democratic Party arranged another kind of grand compromise between the left, represented by Bernie Sanders, and the center, represented by Joe Biden, which led to a profound commitment symbolized by Build Back Better, a $5 trillion project that was going to insert industrial policy into the heart of government economic relations in a way that marks the Biden administration as profoundly different from his Democratic predecessors, both Obama and Clinton.
  • I think the Republican Party does not have agreement on that to the same degree. And one of the interesting things to watch if Trump wins is how that internal fight in the Republican Party works itself out.
  • So the sort of ideological strain in the Republican Party that JD Vance is part of, this sort of more populist dimension of it: What they see markets and, particularly, free trade and trade with China and immigration as having violated is the strength of communities and families. They look around, and they see broken communities, hollowed-out communities. They see families where the male breadwinners have lost their jobs and lost their earning power, and so they’re not getting married, and there are divorces, and there are too many single-parent families.
  • on the Democratic side, I think there’s some of the same views. There’s a lot of broken communities.
  • a huge participant in this ideologically is climate change: the sense that markets would happily make people rich by cooking the planet. The market doesn’t know whether the profit is coming from burning oil or from laying down solar panels. And so, once again, some goal actually does need to be set. Markets can maybe serve our goals. They can serve our vision, but they can’t be assumed to get what we want right in the world.
  • And so the sense on both parties that you actually do need to define goals and define vision and that, ultimately, that is going to have to happen through government setting policy and making decisions — the primacy of that kind of dialogue now, the degree to which the first conversation is: What are we trying to achieve? That does feel different.
  • that speaks to the decisive nature of the election of 2016, which we will see the longer we get from it as a decisive inflection point, as really marking the end of the neoliberal order
  • It doesn’t mean that suddenly there are no more advocates of strong free markets. I think one of the questions now and one of the key questions for the Republican Party is: Can they get serious about this?
  • It requires them to have a serious program of political economy in a party that has lacked direction on political economy for quite some time.
  • You describe the sort of neoliberal era as bringing this much more cosmopolitan view of ethics, of morals and of America’s relationship with the world — a more sort of urbanist view. There’s a lot of connections between what it means to live in New York and to live in London and to live in Tokyo and to live in Hong Kong.
  • JD Vance is a good example of this — are much more skeptical of the individualistic moral structure that dominated here and that Republicans, for all the influence of the Christian right, largely left untouched.
  • it’s actually very complicated in both parties because Donald Trump is himself such a poor vehicle for a return of traditionalist virtue. But there is something happening here, a sort of questioning of not just government policy and industrial policy but: Did all this individualism work? Is a world where kids are on their smartphones all the time and families are having this much trouble — and did we get something more fundamental, almost spiritual, wrong?
  • The concern about the moral fiber of the American people is not new in the Republican Party. That goes back to Jerry Falwell and to some of the ministers who became popular in the 1990s, calling America back to moral virtue and identifying enemies of God.
  • The new element is a sense that one has to connect that concern for this kind of morality to a serious program of political economy, that it’s not enough simply to call on people to be virtuous.
  • that serious conservatives have to find a way to rebuild the economic foundation that lies at the root of so much immorality and so much despair in American life.
  • If that develops enough of a base in the Republican Party, then there becomes an opportunity to talk with Democrats about that, about family welfare, about the welfare of children, about creating institutions, both economic and social, that have the capacity to sustain communities in ways in which they have not been sustained.
  • There are some issues that run so deeply on questions of morality between Republicans and Democrats, it’s hard to see how they can find common ground. And probably the most important of these is on the question of abortion and reproductive rights. And to the extent to which JD Vance and his associates take their stand on this issue, the possibilities for developing a conversation about morality with liberals and Democrats are going to be very, very slim, indeed.
  • the things that I think would have once been framed in terms of Christianity are now framed in terms of classical virtue. There’s a sort of rediscovery of the Stoics, not the early Christians.
  • there’s something here where — obviously, efforts to remoralize America are not new — but this idea that we have gone wrong in modernity by becoming so individualistic seems to be gathering a fair amount of force.
  • My read of it is that the Christian right is just too weak and not sufficiently appealing to be the vehicle for it. And so these other aesthetic and ancient containers are being searched for, but there is some kind of pushback happening
  • I think you see a lot of interest among people in both parties around some of these tech regulations. But I think of that as sort of fundamentally moralistic.
  • The Christian right has become somewhat contaminated by its blind adherence to Trump and by its too great a willingness to plunge into politics with any messenger, no matter what moral qualities they’re exhibiting.
  • That there is a movement among conservatives to step back from that and to ground their morality in something deeper, more widespread, something that can appeal to a greater cross-section of Americans, regardless of whether they go to church or not
  • If there is a moral awakening underway that is not tied to instrumentalizing churches for strictly partisan purposes, which is one way of describing evangelicalism in the last 20, 25 years, then that would be new.
  • Sarah Igo, “The Known Citizen” — very different kind of book — “A History of Privacy in Modern America.” We’re talking about morality, we’re talking about community, and of course, social media has put the question of privacy and what constitutes privacy and what’s private and what’s public — such an urgent question in understanding America. And she gives us a wonderful hundred-year overview of how Americans in almost every generation have redefined the boundary between private and public, and I found that extremely useful in thinking about where America is at in the 21st century.
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage:
  • the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • . I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies”?
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • (An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed “there is a free exchange of views in this company” and “people are not afraid to say what they really think”.)
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times’s failure to take Trump’s chances seriously as an example of how “prematurely shutting down inquiry and debate” can allow “conventional wisdom to ossify in a way that blinds society”.
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values also better serves not just truth but justice, too
  • over the long term journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance,
  • will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • (I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and opinion columnist. The paper’s spokeswoman, Danielle Rhoades Ha, did not respond to the question.)
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom’s journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, this reporter shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Ms Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to “refine” it and “add context”, and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.”
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

Collapsing Levels of Trust Are Devastating America - The Atlantic - 0 views

  • American history is driven by periodic moments of moral convulsion
  • Harvard political scientist Samuel P. Huntington noticed that these convulsions seem to hit the United States every 60 years or so: the Revolutionary period of the 1760s and ’70s; the Jacksonian uprising of the 1820s and ’30s; the Progressive Era, which began in the 1890s; and the social-protest movements of the 1960s and early ’70s
  • A highly moralistic generation appears on the scene. It uses new modes of communication to seize control of the national conversation. Groups formerly outside of power rise up and take over the system. These are moments of agitation and excitement, frenzy and accusation, mobilization and passion.
  • In 1981, Huntington predicted that the next moral convulsion would hit America around the second or third decade of the 21st century—that is, right about now.
  • Trump is the final instrument of this crisis, but the conditions that brought him to power and make him so dangerous at this moment were decades in the making, and those conditions will not disappear if he is defeated.
  • Social trust is a measure of the moral quality of a society—of whether the people and institutions in it are trustworthy, whether they keep their promises and work for the common g
  • When people in a society lose faith or trust in their institutions and in each other, the nation collapses.
  • This is an account of how, over the past few decades, America became a more untrustworthy society
  • under the stresses of 2020, American institutions and the American social order crumbled and were revealed as more untrustworthy still
  • We had a chance, in crisis, to pull together as a nation and build trust. We did not. That has left us a broken, alienated society caught in a distrust doom loop.
  • The Baby Boomers grew up in the 1950s and ’60s, an era of family stability, widespread prosperity, and cultural cohesion. The mindset they embraced in the late ’60s and have embodied ever since was all about rebelling against authority, unshackling from institutions, and celebrating freedom, individualism, and liberation.
  • The emerging generations today enjoy none of that sense of security. They grew up in a world in which institutions failed, financial systems collapsed, and families were fragile. Children can now expect to have a lower quality of life than their parents, the pandemic rages, climate change looms, and social media is vicious. Their worldview is predicated on threat, not safety.
  • Thus the values of the Millennial and Gen Z generations that will dominate in the years ahead are the opposite of Boomer values: not liberation, but security; not freedom, but equality; not individualism, but the safety of the collective; not sink-or-swim meritocracy, but promotion on the basis of social justice
  • A new culture is dawning. The Age of Precarity is here.
  • I’ve spent my career rebutting the idea that America is in decline, but the events of these past six years, and especially of 2020, have made clear that we live in a broken nation. The cancer of distrust has spread to every vital organ.
  • Those were the days of triumphant globalization. Communism was falling. Apartheid was ending. The Arab-Israeli dispute was calming down. Europe was unifying. China was prospering. In the United States, a moderate Republican president, George H. W. Bush, gave way to the first Baby Boomer president, a moderate Democrat, Bill Clinton.
  • The stench of national decline is in the air. A political, social, and moral order is dissolving. America will only remain whole if we can build a new order in its place.
  • The American economy grew nicely. The racial wealth gap narrowed. All the great systems of society seemed to be working: capitalism, democracy, pluralism, diversity, globalization. It seemed, as Francis Fukuyama wrote in his famous “The End of History?” essay for The National Interest, “an unabashed victory for economic and political liberalism.”
  • Nations with low social trust—like Brazil, Morocco, and Zimbabwe—have struggling economies.
  • We think of the 1960s as the classic Boomer decade, but the false summer of the 1990s was the high-water mark of that ethos
  • The first great theme of that era was convergence. Walls were coming down. Everybody was coming together.
  • The second theme was the triumph of classical liberalism. Liberalism was not just a philosophy—it was a spirit and a zeitgeist, a faith that individual freedom would blossom in a loosely networked democratic capitalist world. Enterprise and creativity would be unleashed. America was the great embodiment and champion of this liberation.
  • The third theme was individualism. Society flourished when individuals were liberated from the shackles of society and the state, when they had the freedom to be true to themselves.
  • For his 2001 book, Moral Freedom, the political scientist Alan Wolfe interviewed a wide array of Americans. The moral culture he described was no longer based on mainline Protestantism, as it had been for generations
  • Instead, Americans, from urban bobos to suburban evangelicals, were living in a state of what he called moral freedom: the belief that life is best when each individual finds his or her own morality—inevitable in a society that insists on individual freedom.
  • moral freedom, like the other dominant values of the time, contained within it a core assumption: If everybody does their own thing, then everything will work out for everybody.
  • This was an ideology of maximum freedom and minimum sacrifice.
  • It all looks naive now. We were naive about what the globalized economy would do to the working class, naive to think the internet would bring us together, naive to think the global mixing of people would breed harmony, naive to think the privileged wouldn’t pull up the ladders of opportunity behind them
  • Over the 20 years after I sat with Kosieva, it all began to unravel. The global financial crisis had hit, the Middle East was being ripped apart by fanatics. On May 15, 2011, street revolts broke out in Spain, led by the self-declared Indignados—“the outraged.” “They don’t represent us!” they railed as an insult to the Spanish establishment. It would turn out to be the cry of a decade.
  • Millennials and members of Gen Z have grown up in the age of that disappointment, knowing nothing else. In the U.S. and elsewhere, this has produced a crisis of faith, across society but especially among the young. It has produced a crisis of trust.
  • Social trust is a generalized faith in the people of your community. It consists of smaller faiths. It begins with the assumption that we are interdependent, our destinies linked. It continues with the assumption that we share the same moral values. We share a sense of what is the right thing to do in different situations
  • High-trust societies have what Fukuyama calls spontaneous sociability. People are able to organize more quickly, initiate action, and sacrifice for the common good.
  • When you look at research on social trust, you find all sorts of virtuous feedback loops. Trust produces good outcomes, which then produce more trust. In high-trust societies, corruption is lower and entrepreneurship is catalyzed.
  • Higher-trust nations have lower economic inequality, because people feel connected to each other and are willing to support a more generous welfare state.
  • People in high-trust societies are more civically engaged. Nations that score high in social trust—like the Netherlands, Sweden, China, and Australia—have rapidly growing or developed economies.
  • Renewal is hard to imagine. Destruction is everywhere, and construction difficult to see.
  • As the ethicist Sissela Bok once put it, “Whatever matters to human beings, trust is the atmosphere in which it thrives.”
  • During most of the 20th century, through depression and wars, Americans expressed high faith in their institutions
  • In 1964, for example, 77 percent of Americans said they trusted the federal government to do the right thing most or all of the time.
  • By 1994, only one in five Americans said they trusted government to do the right thing.
  • Then came the Iraq War and the financial crisis and the election of Donald Trump. Institutional trust levels remained pathetically low. What changed was the rise of a large group of people who were actively and poisonously alienated—who were not only distrustful but explosively distrustful. Explosive distrust is not just an absence of trust or a sense of detached alienation—it is an aggressive animosity and an urge to destroy. Explosive distrust is the belief that those who disagree with you are not just wrong but illegitimate
  • In 1997, 64 percent of Americans had a great or good deal of trust in the political competence of their fellow citizens; today only a third of Americans feel that way.
  • In most societies, interpersonal trust is stable over the decades. But for some—like Denmark, where about 75 percent say the people around them are trustworthy, and the Netherlands, where two-thirds say so—the numbers have actually risen.
  • In America, interpersonal trust is in catastrophic decline. In 2014, according to the General Social Survey conducted by NORC at the University of Chicago, only 30.3 percent of Americans agreed that “most people can be trusted,”
  • Today, a majority of Americans say they don’t trust other people when they first meet them.
  • There’s evidence to suggest that marital infidelity, academic cheating, and animal cruelty are all on the rise in America, but it’s hard to directly measure the overall moral condition of society—how honest people are, and how faithful.
  • Trust is the ratio between the number of people who betray you and the number of people who remain faithful to you. It’s not clear that there is more betrayal in America than there used to be—but there are certainly fewer faithful supports around people than there used to be.
  • Hundreds of books and studies on declining social capital and collapsing family structure demonstrate this. In the age of disappointment, people are less likely to be surrounded by faithful networks of people they can trust.
  • Black Americans have high trust in other Black Americans; it’s the wider society they don’t trust, for good and obvious reasons
  • As Vallier puts it, trust levels are a reflection of the moral condition of a nation at any given time.
  • high national trust is a collective moral achievement.
  • High national distrust is a sign that people have earned the right to be suspicious. Trust isn’t a virtue—it’s a measure of other people’s virtue.
  • Unsurprisingly, the groups with the lowest social trust in America are among the most marginalized.
  • Black Americans have been one of the most ill-treated groups in American history; their distrust is earned distrust
  • In 2018, 37.3 percent of white Americans felt that most people can be trusted, according to the General Social Survey, but only 15.3 percent of Black Americans felt the same.
  • People become trusting when the world around them is trustworthy. When they are surrounded by people who live up to their commitments. When they experience their country as a fair place.
  • In 2002, 43 percent of Black Americans were very or somewhat satisfied with the way Black people are treated in the U.S. By 2018, only 18 percent felt that way, according to Gallup.
  • The second disenfranchised low-trust group includes the lower-middle class and the working poor.
  • this group makes up about 40 percent of the country.
  • “They are driven by the insecurity of their place in society and in the economy,” he says. They are distrustful of technology and are much more likely to buy into conspiracy theories. “They’re often convinced by stories that someone is trying to trick them, that the world is against them,”
  • the third marginalized group that scores extremely high on social distrust: young adults. These are people who grew up in the age of disappointment. It’s the only world they know.
  • In 2012, 40 percent of Baby Boomers believed that most people can be trusted, as did 31 percent of members of Generation X. In contrast, only 19 percent of Millennials said most people can be trusted
  • Seventy-three percent of adults under 30 believe that “most of the time, people just look out for themselves,” according to a Pew survey from 2018. Seventy-one percent of those young adults say that most people “would try to take advantage of you if they got a chance.
  • A mere 10 percent of Gen Zers trust politicians to do the right thing.
  • Only 35 percent of young people, versus 67 percent of old people, believe that Americans respect the rights of people who are not like them.
  • Fewer than a third of Millennials say America is the greatest country in the world, compared to 64 percent of members of the Silent Generation.
  • “values and behavior are shaped by the degree to which survival is secure.” In the age of disappointment, our sense of safety went away
  • Some of this is physical insecurity: school shootings, terrorist attacks, police brutality, and overprotective parenting at home
  • the true insecurity is financial, social, and emotional.
  • By the time the Baby Boomers hit a median age of 35, their generation owned 21 percent of the nation’s wealth
  • First, financial insecurity
  • As of last year, Millennials—who will hit an average age of 35 in three years—owned just 3.2 percent of the nation’s wealth.
  • Next, emotional insecurity:
  • fewer children growing up in married two-parent households, more single-parent households, more depression, and higher suicide rates.
  • Then, identity insecurity.
  • All the traits that were once assigned to you by your community, you must now determine on your own: your identity, your morality, your gender, your vocation, your purpose, and the place of your belonging. Self-creation becomes a major anxiety-inducing act of young adulthood.
  • liquid modernity
  • Finally, social insecurity.
  • In the age of social media our “sociometers”—the antennae we use to measure how other people are seeing us—are up and on high alert all the time. Am I liked? Am I affirmed?
  • Danger is ever present. “For many people, it is impossible to think without simultaneously thinking about what other people would think about what you’re thinking,” the educator Fredrik deBoer has written. “This is exhausting and deeply unsatisfying. As long as your self-conception is tied up in your perception of other people’s conception of you, you will never be free to occupy a personality with confidence; you’re always at the mercy of the next person’s dim opinion of you and your whole deal.”
  • In this world, nothing seems safe; everything feels like chaos.
  • Distrust sows distrust. It produces the spiritual state that Emile Durkheim called anomie, a feeling of being disconnected from society, a feeling that the whole game is illegitimate, that you are invisible and not valued, a feeling that the only person you can really trust is yourself.
  • People plagued by distrust can start to see threats that aren’t there; they become risk averse
  • Americans take fewer risks and are much less entrepreneurial than they used to be. In 2014, the rate of business start-ups hit a nearly 40-year low. Since the early 1970s, the rate at which people move across state lines each year has dropped by 56 percent
  • People lose faith in experts. They lose faith in truth, in the flow of information that is the basis of modern society. “A world of truth is a world of trust, and vice versa,”
  • In periods of distrust, you get surges of populism; populism is the ideology of those who feel betrayed
  • People are drawn to leaders who use the language of menace and threat, who tell group-versus-group power narratives. You also get a lot more political extremism. People seek closed, rigid ideological systems that give them a sense of security.
  • fanaticism is a response to existential anxiety. When people feel naked and alone, they revert to tribe. Their radius of trust shrinks, and they only trust their own kind.
  • When many Americans see Trump’s distrust, they see a man who looks at the world as they do.
  • By February 2020, America was a land mired in distrust. Then the plague arrived.
  • From the start, the pandemic has hit the American mind with sledgehammer force. Anxiety and depression have spiked. In April, Gallup recorded a record drop in self-reported well-being, as the share of Americans who said they were thriving fell to the same low point as during the Great Recession
  • These kinds of drops tend to produce social upheavals. A similar drop was seen in Tunisian well-being just before the street protests that led to the Arab Spring.
  • The emotional crisis seems to have hit low-trust groups the hardest
  • “low trusters” were more nervous during the early months of the pandemic, more likely to have trouble sleeping, more likely to feel depressed, less likely to say the public authorities were responding well to the pandemic
  • Eighty-one percent of Americans under 30 reported feeling anxious, depressed, lonely, or hopeless at least one day in the previous week, compared to 48 percent of adults 60 and over.
  • Americans looked to their governing institutions to keep them safe. And nearly every one of their institutions betrayed them
  • The president downplayed the crisis, and his administration was a daily disaster area
  • The Centers for Disease Control and Prevention produced faulty tests, failed to provide up-to-date data on infections and deaths, and didn’t provide a trustworthy voice for a scared public.
  • The Food and Drug Administration wouldn’t allow private labs to produce their own tests without a lengthy approval process.
  • In nations that ranked high on the World Values Survey measure of interpersonal trust—like China, Australia, and most of the Nordic states—leaders were able to mobilize quickly, come up with a plan, and count on citizens to comply with the new rules.
  • In low-trust nations—like Mexico, Spain, and Brazil—there was less planning, less compliance, less collective action, and more death.
  • Countries that fell somewhere in the middle—including the U.S., Germany, and Japan—had a mixed record depending on the quality of their leadership.
  • South Korea, where more than 65 percent of people say they trust government when it comes to health care, was able to build a successful test-and-trace regime. In America, where only 31 percent of Republicans and 44 percent of Democrats say the government should be able to use cellphone data to track compliance with experts’ coronavirus social-contact guidelines, such a system was never really implemented.
  • For decades, researchers have been warning about institutional decay. Institutions get caught up in one of those negative feedback loops that are so common in a world of mistrust. They become ineffective and lose legitimacy. People who lose faith in them tend not to fund them. Talented people don’t go to work for them. They become more ineffective still.
  • On the right, this anti-institutional bias has manifested itself as hatred of government; an unwillingness to defer to expertise, authority, and basic science; and a reluctance to fund the civic infrastructure of society, such as a decent public health system
  • On the left, distrust of institutional authority has manifested as a series of checks on power that have given many small actors the power to stop common plans, producing what Fukuyama calls a vetocracy
  • In 2020, American institutions groaned and sputtered. Academics wrote up plan after plan and lobbed them onto the internet. Few of them went anywhere. America had lost the ability to build new civic structures to respond to ongoing crises like climate change, opioid addiction, and pandemics, or to reform existing ones.
  • In a lower-trust era like today, Levin told me, “there is a greater instinct to say, ‘They’re failing us.’ We see ourselves as outsiders to the systems—an outsider mentality that’s hard to get out of.”
  • Americans haven’t just lost faith in institutions; they’ve come to loathe them, even to think that they are evil
  • 55 percent of Americans believe that the coronavirus that causes COVID-19 was created in a lab and 59 percent believe that the U.S. government is concealing the true number of deaths
  • Half of all Fox News viewers believe that Bill Gates is plotting a mass-vaccination campaign so he can track people.
  • This spring, nearly a third of Americans were convinced that it was probably or definitely true that a vaccine existed but was being withheld by the government.
  • institutions like the law, the government, the police, and even the family don’t merely serve social functions, Levin said; they form the individuals who work and live within them. The institutions provide rules to live by, standards of excellence to live up to, social roles to fulfill.
  • By 2020, people had stopped seeing institutions as places they entered to be morally formed,
  • Instead, they see institutions as stages on which they can perform, can display their splendid selves.
  • People run for Congress not so they can legislate, but so they can get on TV. People work in companies so they can build their personal brand.
  • The result is a world in which institutions not only fail to serve their social function and keep us safe, they also fail to form trustworthy people. The rot in our structures spreads to a rot in ourselves.
  • The Failure of Society
  • The coronavirus has confronted America with a social dilemma. A social dilemma, the University of Pennsylvania scholar Cristina Bicchieri notes, is “a situation in which each group member gets a higher outcome if she pursues her individual self-interest, but everyone in the group is better off if all group members further the common interest.”
  • Social distancing is a social dilemma. Many low-risk individuals have been asked to endure some large pain (unemployment, bankruptcy) and some small inconvenience (mask wearing) for the sake of the common good. If they could make and keep this moral commitment to each other in the short term, the curve would be crushed, and in the long run we’d all be better off. It is the ultimate test of American trustworthiness.
  • While pretending to be rigorous, people relaxed and started going out. It was like watching somebody gradually give up on a diet. There wasn’t a big moment of capitulation, just an extra chocolate bar here, a bagel there, a scoop of ice cream before bed
  • in reality this was a mass moral failure of Republicans and Democrats and independents alike. This was a failure of social solidarity, a failure to look out for each other.
  • Alexis de Tocqueville discussed a concept called the social body. Americans were clearly individualistic, he observed, but they shared common ideas and common values, and could, when needed, produce common action. They could form a social body.
  • Over time, those common values eroded, and were replaced by a value system that put personal freedom above every other value
  • When Americans were confronted with the extremely hard task of locking down for months without any of the collective resources that would have made it easier—habits of deference to group needs; a dense network of community bonds to help hold each other accountable; a history of trust that if you do the right thing, others will too; preexisting patterns of cooperation; a sense of shame if you deviate from the group—they couldn’t do it. America failed.
  • The Crack-up
  • This wasn’t just a political and social crisis, it was also an emotional trauma.
  • The week before George Floyd was killed, the National Center for Health Statistics released data showing that a third of all Americans were showing signs of clinical anxiety or depression. By early June, after Floyd’s death, the percentage of Black Americans showing clinical signs of depression and anxiety disorders had jumped from 36 to 41 percent
  • By late June, American national pride was lower than at any time since Gallup started measuring, in 2001
  • In another poll, 71 percent of Americans said they were angry about the state of the country, and just 17 percent said they were proud.
  • By late June, it was clear that America was enduring a full-bore crisis of legitimacy, an epidemic of alienation, and a loss of faith in the existing order.
  • The most alienated, anarchic actors in society—antifa, the Proud Boys, QAnon—seemed to be driving events. The distrust doom loop was now at hand.
  • The Age of Precarity
  • Cultures are collective responses to common problems. But when reality changes, culture takes a few years, and a moral convulsion, to completely shake off the old norms and values.
  • The culture that is emerging, and which will dominate American life over the next decades, is a response to a prevailing sense of threat.
  • This new culture values security over liberation, equality over freedom, the collective over the individual.
  • From risk to security.
  • we’ve entered an age of precarity in which every political or social movement has an opportunity pole and a risk pole. In the opportunity mentality, risk is embraced because of the upside possibilities. In the risk mindset, security is embraced because people need protection from downside dangers
  • In this period of convulsion, almost every party and movement has moved from its opportunity pole to its risk pole.
  • From achievement to equality
  • In the new culture we are entering, that meritocratic system looks more and more like a ruthless sorting system that excludes the vast majority of people, rendering their life precarious and second class, while pushing the “winners” into a relentless go-go lifestyle that leaves them exhausted and unhappy
  • Equality becomes the great social and political goal. Any disparity—racial, economic, meritocratic—comes to seem hateful.
  • From self to society
  • If we’ve lived through an age of the isolated self, people in the emerging culture see embedded selves. Socialists see individuals embedded in their class group. Right-wing populists see individuals as embedded pieces of a national identity group. Left-wing critical theorists see individuals embedded in their racial, ethnic, gender, or sexual-orientation identity group.
  • The cultural mantra shifts from “Don’t label me!” to “My label is who I am.”
  • From global to local
  • When there is massive distrust of central institutions, people shift power to local institutions, where trust is higher. Power flows away from Washington to cities and states.
  • From liberalism to activism
  • enlightenment liberalism, which was a long effort to reduce the role of passions in politics and increase the role of reason. Politics was seen as a competition between partial truths.
  • Liberalism is ill-suited for an age of precarity. It demands that we live with a lot of ambiguity, which is hard when the atmosphere already feels unsafe. Furthermore, it is thin. It offers an open-ended process of discovery when what people hunger for is justice and moral certainty.
  • liberalism’s niceties come to seem like a cover that oppressors use to mask and maintain their systems of oppression. Public life isn’t an exchange of ideas; it’s a conflict of groups engaged in a vicious death struggle
  • The cultural shifts we are witnessing offer more safety to the individual at the cost of clannishness within society. People are embedded more in communities and groups, but in an age of distrust, groups look at each other warily, angrily, viciously.
  • The shift toward a more communal viewpoint is potentially a wonderful thing, but it leads to cold civil war unless there is a renaissance of trust. There’s no avoiding the core problem. Unless we can find a way to rebuild trust, the nation does not function.
  • How to Rebuild Trust
  • Historians have more to offer, because they can cite examples of nations that have gone from pervasive social decay to relative social health. The two most germane to our situation are Great Britain between 1830 and 1848 and the United States between 1895 and 1914.
  • In both periods, a highly individualistic and amoral culture was replaced by a more communal and moralistic one.
  • But there was a crucial difference between those eras and our own, at least so far. In both cases, moral convulsion led to frenetic action.
  • As Robert Putnam and Shaylyn Romney Garrett note in their forthcoming book, The Upswing, the American civic revival that began in the 1870s produced a stunning array of new organizations: the United Way, the NAACP, the Boy Scouts, the Forest Service, the Federal Reserve System, 4-H clubs, the Sierra Club, the settlement-house movement, the compulsory-education movement, the American Bar Association, the American Legion, the ACLU, and on and on
  • After the civic revivals, both nations witnessed frenetic political reform. During the 1830s, Britain passed the Reform Act, which widened the franchise; the Factory Act, which regulated workplaces; and the Municipal Corporations Act, which reformed local government.
  • The Progressive Era in America saw an avalanche of reform: civil-service reform; food and drug regulation; the Sherman Act, which battled the trusts; the secret ballot; and so on. Civic life became profoundly moralistic, but political life became profoundly pragmatic and anti-ideological. Pragmatism and social-science expertise were valued.
  • Can America in the 2020s turn itself around the way the America of the 1890s, or the Britain of the 1830s, did? Can we create a civic renaissance and a legislative revolution?
  • I see no scenario in which we return to being the nation we were in 1965, with a cohesive national ethos, a clear national establishment, trusted central institutions, and a pop-culture landscape in which people overwhelmingly watched the same shows and talked about the same things.
  • The age of distrust has smashed the converging America and the converging globe—that great dream of the 1990s—and has left us with the reality that our only plausible future is decentralized pluralism.
  • The key to making decentralized pluralism work still comes down to one question: Do we have the energy to build new organizations that address our problems, the way the Brits did in the 1830s and Americans did in the 1890s?
  • social trust is built within organizations in which people are bound together to do joint work, in which they struggle together long enough for trust to gradually develop, in which they develop shared understandings of what is expected of each other, in which they are enmeshed in rules and standards of behavior that keep them trustworthy when their commitments might otherwise falter.
  • Over the past 60 years, we have given up on the Rotary Club and the American Legion and other civic organizations and replaced them with Twitter and Instagram. Ultimately, our ability to rebuild trust depends on our ability to join and stick to organizations.
  • Whether we emerge from this transition stronger depends on our ability, from the bottom up and the top down, to build organizations targeted at our many problems. If history is any guide, this will be the work not of months, but of one or two decades.
  • For centuries, America was the greatest success story on earth, a nation of steady progress, dazzling achievement, and growing international power. That story threatens to end on our watch, crushed by the collapse of our institutions and the implosion of social trust
  • But trust can be rebuilt through the accumulation of small heroic acts—by the outrageous gesture of extending vulnerability in a world that is mean, by proffering faith in other people when that faith may not be returned. Sometimes trust blooms when somebody holds you against all logic, when you expected to be dropped.
  • By David Brooks
Javier E

Japanese Culture: 4th Edition (Updated and Expanded) (Kindle version) (Studies of the W... - 0 views

  • It is fitting that Japan’s earliest remaining works, composed at a time when the country was so strongly under the civilizing influence of China, should be of a historical character. In the Confucian tradition, the writing of history has always been held in the highest esteem, since Confucianists believe that the lessons of the past provide the best guide for ethical rule in the present and future. In contrast to the Indians, who have always been absorbed with metaphysical and religious speculation and scarcely at all with history, the Chinese are among the world’s greatest record-keepers.
  • he wrote that it is precisely because life and nature are changeable and uncertain that things have the power to move us.
  • The turbulent centuries of the medieval age produced many new cultural pursuits that catered to the tastes of various classes of society, including warriors, merchants, and even peasants. Yet, coloring nearly all these pursuits was miyabi, reflected in a fundamental preference on the part of the Japanese for the elegant, the restrained, and the subtly suggestive.
  • ...65 more annotations...
  • “Nothing in the West can compare with the role which aesthetics has played in Japanese life and history since the Heian period”; and “the miyabi spirit of refined sensibility is still very much in evidence” in modern aesthetic criticism.9
  • there has run through history the idea that the Japanese are, in terms of their original nature (that is, their nature before the introduction from the outside of such systems of thought and religion as Confucianism and Buddhism), essentially an emotional people. And in stressing the emotional side of human nature, the Japanese have always assigned high value to sincerity (makoto) as the ethic of the emotions.
  • If the life of the emotions thus had an ethic in makoto, the evolution of mono no aware in the Heian period provided it also with an aesthetic.
  • Tsurayuki said, in effect, that people are emotional entities and will intuitively and spontaneously respond in song and verse when they perceive things and are moved. The most basic sense of mono no aware is the capacity to be moved by things, whether they are the beauties of nature or the feelings of people,
  • One of the finest artistic achievements of the middle and late Heian period was the evolution of a native style of essentially secular painting that reached its apex in the narrative picture scrolls of the twelfth century. The products of this style of painting are called “Yamato [that is, Japanese] pictures” to distinguish them from works categorized as “Chinese pictures.”
  • The Fujiwara epoch, in literature as well as the visual arts, was soft, approachable, and “feminine.” By contrast, the earlier Jōgan epoch had been forbidding, secretive (esoteric), and “masculine.”
  • Despite the apparent lust of the samurai for armed combat and martial renown, much romanticized in later centuries, the underlying tone of the medieval age in Japan was from the beginning somber, pessimistic, and despairing. In The Tale of Genji the mood shifted from satisfaction with the perfections of Heian courtier society to uncertainty about this life and a craving for salvation in the next.
  • Despite political woes and territorial losses, the Sung was a time of great advancement in Chinese civilization. Some scholars, impressed by the extensive growth in cities, commerce, maritime trade, and governmental bureaucratization in the late T’ang and Sung, have even asserted that this was the age when China entered its “early modern” phase. The Sung was also a brilliant period culturally.
  • the fortuitous combination of desire on the part of the Sung to increase its foreign trade with Japan and the vigorous initiative taken in maritime activity by the Taira greatly speeded the process of transmission.
  • The Sung period in China, on the other hand, was an exceptional age for scholarship, most notably perhaps in history and in the compilation of encyclopedias and catalogs of art works. This scholarly activity was greatly facilitated by the development of printing, invented by the Chinese several centuries earlier.
  • In addition to reviving interest in Japanese poetry, the use of kana also made possible the evolution of a native prose literature.
  • peasantry, who formed the nucleus of what came to be known as the True Sect of Pure Land Buddhism. Through the centuries, this sect has attracted one of the largest followings among the Japanese, and its founder, Shinran, has been canonized as one of his country’s most original religious thinkers.
  • True genre art, picturing all classes at work and play, did not appear in Japan until the sixteenth century. The oldest extant genre painting of the sixteenth century is a work, dating from about 1525, called “Views Inside and Outside Kyoto” (rakuchū-rakugai zu).
  • the aesthetic principles that were largely to dictate the tastes of the medieval era. We have just remarked the use of sabi. Another major term of the new medieval aesthetics was yūgen, which can be translated as “mystery and depth.” Let
  • One of the basic values in the Japanese aesthetic tradition—along with such things as perishability, naturalness, and simplicity—is suggestion. The Japanese have from earliest times shown a distinct preference for the subtleties of suggestion, intimation, and nuance, and have characteristically sought to achieve artistic effect by means of “resonances” (yojō).
  • Amidism was not established as a separate sect until the time of the evangelist Hōnen (1133–1212).
  • But even in Chōmei we can observe a tendency to transform what is supposed to be a mean hovel into something of beauty based on an aesthetic taste for “deprivation” (to be discussed later in this chapter) that evolved during medieval times.
  • Apart from the proponents of Pure Land Buddhism, the person who most forcefully propagated the idea of universal salvation through faith was Nichiren (1222–82).
  • Nichiren held that ultimate religious truth lay solely in the Lotus Sutra, the basic text of the Greater Vehicle of Buddhism in which Gautama had revealed that all beings possess the potentiality for buddhahood.
  • At the time of its founding in Japan by Saichō in the early ninth century, the Tendai sect had been based primarily on the Lotus Sutra; but, in the intervening centuries, Tendai had deviated from the Sutra’s teachings and had even spawned new sects, like those of Pure Land Buddhism, that encouraged practices entirely at variance with these teachings.
  • Declaring himself “the pillar of Japan, the eye of the nation, and the vessel of the country,”14 Nichiren seems even to have equated himself with Japan and its fate.
  • The kōan is especially favored by what the Japanese call the Rinzai sect of Zen, which is also known as the school of “sudden enlightenment” because of its belief that satori, if it is attained, will come to the individual in an instantaneous flash of insight or awareness. The other major sect of Zen, Sōtō, rejects this idea of sudden enlightenment and instead holds that satori is a gradual process to be attained primarily through seated meditation.
  • Fought largely in Kyoto and its environs, the Ōnin War dragged on for more than ten years, and after the last armies withdrew in 1477 the once lovely capital lay in ruins. There was no clear-cut victor in the Ōnin War. The daimyos had simply fought themselves into exhaustion,
  • Yoshimasa was perhaps even more noteworthy as a patron of the arts than his grandfather, Yoshimitsu. In any case, his name is just as inseparably linked with the flourishing of culture in the Higashiyama epoch (usually taken to mean approximately the last half of the fifteenth century) as Yoshimitsu’s is with that of Kitayama.
  • The tea room, as a variant of the shoin room, evolved primarily in the sixteenth century.
  • Shukō’s admonition about taking care to “harmonize Japanese and Chinese tastes” has traditionally been taken to mean that he stood, in the late fifteenth century, at a point of transition from the elegant and “aristocratic” kind of Higashiyama chanoyu just described, which featured imported Chinese articles, to a new, Japanese form of the ceremony that used native ceramics,
  • the new kind of tea ceremony originated by Shukō is called wabicha, or “tea based on wabi.” Developed primarily by Shukō’s successors during the sixteenth century, wabicha is a subject for the next chapter.
  • The Japanese, on the other hand, have never dealt with nature in their art in the universalistic sense of trying to discern any grand order or structure; much less have they tried to associate the ideal of order in human society with the harmonies of nature. Rather,
  • The Chinese Sung-style master may have admired a mountain, for example, for its enduring, fixed quality, but the typical Japanese artist (of the fifteenth century or any other age) has been more interested in a mountain for its changing aspects:
  • Zen culture of Muromachi Japan was essentially a secular culture. This seems to be strong evidence, in fact, of the degree to which medieval Zen had become secularized: its view of nature was pantheistic and its concern with man was largely psychological.
  • Nobunaga’s castle at Azuchi and Hideyoshi’s at Momoyama have given their names to the cultural epoch of the age of unification. The designation of this epoch as Azuchi-Momoyama (or, for the sake of convenience, simply Momoyama) is quite appropriate in view of the significance of castles—as represented by these two historically famous structures—in the general progress, cultural and otherwise, of these exciting years.
  • Along with architecture, painting was the art that most fully captured the vigorous and expansive spirit of the Momoyama epoch of domestic culture during the age of unification. It was a time when many styles of painting and groups of painters flourished. Of the latter, by far the best known and most successful were the Kanō,
  • Motonobu also made free use of the colorful Yamato style of native art that had evolved during the Heian period and had reached its pinnacle in the great narrative picture scrolls of the twelfth and thirteenth centuries.
  • what screen painting really called for was color, and it was this that the Kanō artists, drawing on the native Yamato tradition, added to their work with great gusto during the Momoyama epoch. The color that these artists particularly favored was gold, and compositions done in ink and rich pigments on gold-leaf backgrounds became the most characteristic works of Momoyama art.
  • there could hardly be a more striking contrast between the spirits of two ages than the one reflected in the transition from the subdued monochromatic art of Japan’s medieval era to the blazing use of color by Momoyama artists, who stood on the threshold of early modern times.
  • aware, which, as we saw in Chapter 3, connotes the capacity to be moved by things. In the period of the Shinkokinshū, when Saigyō lived, this sentiment was particularly linked with the aesthetic of sabi or “loneliness” (and, by association, sadness). The human condition was essentially one of loneliness;
  • During the sixteenth century the ceremony was further developed as wabicha, or tea (cha) based on the aesthetic of wabi. Haga Kōshirō defines wabi as comprising three kinds of beauty: a simple, unpretentious beauty; an imperfect, irregular beauty; and an austere, stark beauty.
  • The alternate attendance system also had important consequences in the cultural realm, contributing to the development for the first time of a truly national culture. Thus, for example, the daimyos and their followers from throughout the country who regularly visited Edo were the disseminators of what became a national dialect or “lingua franca” and, ultimately, the standard language of modern Japan.
  • They also fostered the spread of customs, rules of etiquette, standards of taste, fashions, and the like that gave to Japanese everywhere a common lifestyle.
  • “[Tokugawa-period] statesmen thought highly of agriculture, but not of agriculturalists.”6 The life of the average peasant was one of much toil and little joy. Organized into villages that were largely self-governing, the peasants were obliged to render a substantial portion of their farming yields—on average, perhaps 50 percent or more—to the samurai, who provided few services in return. The resentment of peasants toward samurai grew steadily throughout the Tokugawa period and was manifested in countless peasant rebellions
  • Although in the long run the seclusion policy undeniably limited the economic growth of Tokugawa Japan by its severe restrictions both on foreign trade and on the inflow of technology from overseas, it also ensured a lasting peace that made possible a great upsurge in the domestic economy, especially during the first century of shogunate rule.
  • Both samurai and peasants were dependent almost solely on income from agriculture and constantly suffered declines in real income as the result of endemic inflation; only the townsmen, who as commercialists could adjust to price fluctuations, were in a position to profit significantly from the economic growth of the age.
  • We should not be surprised, therefore, to find this class giving rise to a lively and exuberant culture that reached its finest flowering in the Genroku epoch at the end of the seventeenth and the beginning of the eighteenth centuries. The mainstays of Genroku culture were the theatre, painting (chiefly in the form of the woodblock print), and prose fiction,
  • The Japanese had, of course, absorbed Confucian thinking from the earliest centuries of contact with China, but for more than a millennium Buddhism had drawn most of their intellectual attention. Not until the Tokugawa period did they come to study Confucianism with any great zeal.
  • One of the most conspicuous features of the transition from medieval to early modern times in Japan was the precipitous decline in the vigor of Buddhism and the rise of a secular spirit.
  • The military potential and much of the remaining landed wealth of the medieval Buddhist sects had been destroyed during the advance toward unification in the late sixteenth century. And although Buddhism remained very much part of the daily lives of the people, it not only ceased to hold appeal for many Japanese intellectuals but indeed even drew the outright scorn and enmity of some.
  • it was the Buddhist church—and especially the Zen sect—that paved the way for the upsurge in Confucian studies during Tokugawa times. Japanese Zen priests had from at least the fourteenth century on assiduously investigated the tenets of Sung Neo-Confucianism, and in ensuing centuries had produced a corpus of research upon which the Neo-Confucian scholarship of the Tokugawa period was ultimately built.
  • Yamaga Sokō is generally credited as the formulator of the code of bushidō, or the “way of the warrior.”4 Certainly he was a pioneer in analyzing the role of the samurai as a member of a true ruling elite and not simply as a rough, and frequently illiterate, participant in the endless civil struggles of the medieval age.
  • The fundamental purpose of Neo-Confucian practice is to calm one’s turbid ki to allow one’s nature (ri) to shine forth. The person who achieves this purpose becomes a sage, his ri seen as one with the universal principle, known as the “supreme ultimate” (taikyoku), that governs all things.
  • Neo-Confucianism proposed two main courses to clarify ri, one objective and the other subjective.7 The objective course was through the acquisition of knowledge by means of the “investigation of things,” a phrase taken by Chu Hsi from the Chinese classic The Great Learning (Ta hsüeh). At the heart of things to investigate was history,
  • Quite apart from any practical guidance to good rulership it may have provided, this Neo-Confucian stress on historical research proved to be a tremendous spur to scholarship and learning in general during the Tokugawa period;8 and, as we will see in the next chapter, it also facilitated the development of other, heterodox lines of intellectual inquiry.
  • the subjective course appeared to have been taken almost directly from Buddhism, and in particular Zen. It was the course of “preserving one’s heart by holding fast to seriousness,” which called for the clarification of ri by means remarkably similar to Zen meditation.
  • The calendrical era of Genroku lasted from 1688 until 1703, but the Genroku cultural epoch is usually taken to mean the span of approximately a half-century from, say, 1675 until 1725. Setting the stage for this rise of a townsman-oriented culture was nearly a century of peace and steady commercial growth.
  • places of diversion and assignation, these quarters were the famous “floating worlds” (ukiyo) of Tokugawa fact and legend. Ukiyo, although used specifically from about this time to designate such demimondes, meant in the broadest sense the insubstantial and ever-changing existence in which man is enmeshed.
  • ukiyo15 always carried the connotation that life is fundamentally sad; but, in Genroku times, the term was more commonly taken to mean a world that was pleasurable precisely because it was constantly changing, exciting, and up-to-date.
  • the Tokugawa period was not at all like the humanism that emerged in the West from the Renaissance on. Whereas modern Western humanism became absorbed with people as individuals, with all their personal peculiarities, feelings, and ways, Japanese humanism of the Tokugawa period scarcely conceived of the existence of true individuals at all; rather, it focused on “the people” and regarded them as comprising essentially types, such as samurai, farmers, and courtesans.
  • there is little in the literature as a whole of that quality—character development—that is probably the single most important feature of the modern Western novel.
  • Although shogunate authorities and Tokugawa-period intellectuals in general had relatively little interest in the purely metaphysical side of Chu Hsi’s teachings, they found his philosophy to be enormously useful in justifying or ideologically legitimizing the feudal structure of state and society that had emerged in Japan by the seventeenth century.
  • With its radical advocacy of violent irrationality—to the point of psychosis—Hagakure has shocked many people. But during Japan’s militarist years of the 1930s and World War II, soldiers and others hailed it as something of a bible of samurai behavior, and the postwar nationalist writer Mishima Yukio was even inspired to write a book in praise of its values.
  • It is significant that many of the leading prose writers, poets, and critics of the most prominent journal of Japanese romanticism, Bungakukai (The Literary World, published from 1893 until 1898), were either converts to or strongly influenced by Protestant Christianity, the only creed in late Meiji Japan that gave primacy to the freedom and spiritual independence of the individual. The absolutism embodied in the Meiji Constitution demanded strict subordination of the interests of the individual to those of the state;
  • The feeling of frustration engendered by a society that placed such preponderant stress upon obedience to the group, especially in the form of filial piety toward one’s parents and loyalty to the state, no doubt accounts for much of the sense of alienation observable in the works of so many modern Japanese writers.
  • These writers have been absorbed to an unusual degree with the individual, the world of his personal psychology, and his essential loneliness. In line with this preoccupation, novelists have perennially turned to the diary-like, confessional tale—the so-called I-novel—as their preferred medium of expression.
  • In intellectual and emotional terms, the military came increasingly to be viewed as the highest repository of the traditional Japanese spirit that was the sole hope for unifying the nation to act in a time of dire emergency.
  • The enemy that had led the people astray was identified as those sociopolitical doctrines and ideologies that had been introduced to Japan from the West during the preceding half-century or so along with the material tools of modernization.
  • If there is a central theme to this book, it is that the Japanese, within the context of a history of abundant cultural borrowing from China in premodern times and the West in the modern age, have nevertheless retained a hard core of native social, ethical, and cultural values by means of which they have almost invariably molded and adapted foreign borrowing to suit their own tastes and purposes.
Javier E

Transcript: Ezra Klein Interviews Robinson Meyer - The New York Times - 0 views

  • Implementation matters, but it’s harder to cover because it’s happening in all parts of the country simultaneously. There isn’t a huge Republican-Democratic fight over it, so there isn’t the conflict that draws the attention to it
  • we sort of implicitly treat policy like it’s this binary one-zero condition. One, you pass a bill, and the thing is going to happen. Zero, you didn’t, and it won’t.
  • ROBINSON MEYER: You can almost divide the law up into different kind of sectors, right? You have the renewable build-out. You have EVs. You have carbon capture. You have all these other decarbonizing technologies the law is trying to encourage
  • ...184 more annotations...
  • that’s particularly true on the I.R.A., which has to build all these things in the real world.
  • we’re trying to do industrial physical transformation at a speed and scale unheralded in American history. This is bigger than anything we have done at this speed ever.
  • The money is beginning to move out the door now, but we’re on a clock. Climate change is not like some other issues where if you don’t solve it this year, it is exactly the same to solve it next year. This is an issue where every year you don’t solve it, the amount of greenhouse gases in the atmosphere builds, warming builds, the effects compound
  • Solve, frankly, isn’t the right word there because all we can do is abate; a lot of the problems are now baked in. So how is it going, and who can actually walk us through that?
  • Robinson Meyer is the founding executive editor of heatmap.news
  • why do all these numbers differ so much? How big is this thing?
  • in electric vehicles and in the effort, kind of this dual effort in the law, to both encourage Americans to buy and use electric vehicles and then also to build a domestic manufacturing base for electric vehicles.
  • on both counts, the data’s really good on electric vehicles. And that’s where we’re getting the fastest response from industry and the clearest response from industry to the law.
  • ROBINSON MEYER: Factories are getting planned. Steel’s going in the ground. The financing for those factories is locked down. It seems like they’re definitely going to happen. They’re permitted. Companies are excited about them. Large Fortune 500 automakers are confidently and with certainty planning for an electric vehicle future, and they’re building the factories to do that in the United States. They’re also building the factories to do that not just in blue states. And so to some degree, we can see the political certainty for electric vehicles going forward.
  • in other parts of the law, partially due to just vagaries of how the law is being implemented, tax credits where the fine print hasn’t worked out yet, it’s too early to say whether the law is working and how it’s going and whether it’s going to accomplish its goal
  • EZRA KLEIN: I always find this very funny in a way. The Congressional Budget Office scored it. They thought it would make about $380 billion in climate investments over a decade. So then you have all these other analyses coming out.
  • But there’s actually this huge range of outcomes in between where the thing passes, and maybe what you wanted to have happen happens. Maybe it doesn’t. Implementation is where all this rubber meets the road
  • the Rhodium Group, which is a consulting firm, they think it could be as high as $522 billion, which is a big difference. Then there’s this Goldman Sachs estimate, which the administration loves, where they say they’re projecting $1.2 trillion in incentives —
  • ROBINSON MEYER: All the numbers differ because most of the important incentives, most of the important tax credits and subsidies in the I.R.A., are uncapped. There’s no limit to how much the government might spend on them. All that matters is that some private citizen or firm or organization come to the government and is like, hey, we did this. You said you’d give us money for it. Give us the money.
  • because of that, different banks have their own energy system models, their own models of the economy. Different research groups have their own models.
  • we know it’s going to be wrong because the Congressional Budget Office is actually quite constrained in how it can predict how these tax credits are taken up. And it’s constrained by the technology that’s out there in the country right now.
  • The C.B.O. can only look at the number of electrolyzers, kind of the existing hydrogen infrastructure in the country, and be like, well, they’re probably all going to use these tax credits. And so I think they said that there would be about $5 billion of take up for the hydrogen tax credits.
  • But sometimes money gets allocated, and then costs overrun, and there are delays, and you can’t get the permits, and so on, and the thing never gets built
  • the fact that the estimates are going up is to them early evidence that this is going well. There is a lot of applications. People want the tax credits. They want to build these new factories, et cetera.
  • a huge fallacy that we make in policy all the time is assuming that once money is allocated for something, you get the thing you’re allocating the money for. Noah Smith, the economics writer, likes to call this checkism, that money equals stuff.
  • EZRA KLEIN: They do not want that, and not wanting that and putting every application through a level of scrutiny high enough to try and make sure you don’t have another one
  • I don’t think people think a lot about who is cutting these checks, but a lot of it is happening in this very obscure office of the Department of Energy, the Loan Program Office, which has gone from having $40 billion in lending authority, which is already a big boost over it not existing a couple decades ago, to $400 billion in loan authority,
  • the Loan Program Office as one of the best places we have data on how this is going right now and one of the offices that’s responded fastest to the I.R.A.
  • the Loan Program Office is basically the Department of Energy’s in-house bank, and it’s kind of the closest thing we have in the US to what exists in other countries, like Germany, which is a State development bank that funds projects that are eventually going to be profitable.
  • It has existed for some time. I mean, it first came into play after the Recovery Act of 2009. And in fact, early in its life, it gave a very important loan to Tesla. It gave this almost bridge loan to Tesla that helped Tesla build up manufacturing capacity, and it got Tesla to where it is today.
  • EZRA KLEIN: It’s because one of the questions I have about that office and that you see in some of the coverage of them is they’re very afraid of having another Solyndra.
  • Now, depending on other numbers, including the D.O.E., it’s potentially as high as $100 billion, but that’s because the whole thing about the I.R.A. is it’s meant to encourage the build-out of this hydrogen infrastructure.
  • EZRA KLEIN: I’m never that excited when I see a government loans program turning a profit because I think that tends to mean they’re not making risky enough loans. The point of the government should be to bear quite a bit of risk —
  • And to some degree, Ford now has to compete, and US automakers are trying to catch up with Chinese EV automakers. And its firms have EV battery technology especially, but just have kind of comprehensive understanding of the EV supply chain that no other countries’ companies have
  • ROBINSON MEYER: You’re absolutely right that this is the key question. They gave this $9.2 billion loan to Ford to build these EV battery plants in Kentucky and Tennessee. It’s the largest loan in the office’s history. It actually means that the investment in these factories is going to be entirely covered by the government, which is great for Ford and great for our build-out of EVs
  • And to some degree, I should say, one of the roles of L.P.O. and one of the roles of any kind of State development bank, right, is to loan to these big factory projects that, yes, may eventually be profitable, may, in fact, assuredly be profitable, but just aren’t there yet or need financing that the private market can’t provide. That being said, they have moved very slowly, I think.
  • And they feel like they’re moving quickly. They just got out new guidelines that are supposed to streamline a lot of this. Their core programs, they just redefined and streamlined in the name of speeding them up
  • However, so far, L.P.O. has been quite slow in getting out new loans
  • I want to say that the pressure they’re under is very real. Solyndra was a disaster for the Department of Energy. Whether that was fair or not fair, there’s a real fear that if you make a couple bad loans that go bad in a big way, you will destroy the political support for this program, and the money will be clawed back, a future Republican administration will wreck the office, whatever it might be. So this is not an easy call.
  • when you tell me they just made the biggest loan in their history to Ford, I’m not saying you shouldn’t lend any money to Ford, but when I think of what is the kind of company that cannot raise money on the capital markets, the one that comes to mind is not Ford
  • They have made loans to a number of more risky companies than Ford, but in addition to speed, do you think they are taking bets on the kinds of companies that need bets? It’s a little bit hard for me to believe that it would have been impossible for Ford to figure out how to finance factories
  • ROBINSON MEYER: Now, I guess what I would say about that is that Ford is — let’s go back to why Solyndra failed, right? Solyndra failed because Chinese solar deluged the market. Now, why did Chinese solar deluge the market? Because there’s such support of Chinese financing from the state for massive solar factories and massive scale.
  • EZRA KLEIN: — the private market can’t. So that’s the meta question I’m asking here. In your view, because you’re tracking this much closer than I am, are they too much under the shadow of Solyndra? Are they being too cautious? Are they getting money out fast enough?
  • ROBINSON MEYER: I think that’s right; that basically, if we think the US should stay competitive and stay as close as it can and not even stay competitive, but catch up with Chinese companies, it is going to require large-scale state support of manufacturing.
  • EZRA KLEIN: OK, that’s fair. I will say, in general, there’s a constant thing you find reporting on government that people in government feel like they are moving very quickly
  • EZRA KLEIN: — given the procedural work they have to go through. And they often are moving very quickly compared to what has been done in that respect before, compared to what they have to get over. They are working weekends, they are working nights, and they are still not actually moving that quickly compared to what a VC firm can do or an investment bank or someone else who doesn’t have the weight of congressional oversight committees potentially calling you in and government procurement rules and all the rest of it.
  • ROBINSON MEYER: I think that’s a theme across the government’s implementation of the I.R.A. right now, is that generally the government feels like it’s moving as fast as it can. And if you look at the Department of Treasury, they feel like we are publishing — basically, the way that most of the I.R.A. subsidies work is that they will eventually be administered by the I.R.S., but first the Department of the Treasury has to write the guidebook for all these subsidies, right?
  • the law says there’s a very general kind of “here’s thousands of dollars for EVs under this circumstance.” Someone still has to go in and write all the fine print. The Department of Treasury is doing that right now for each tax credit, and they have to do that before anyone can claim that tax credit to the I.R.S. Treasury feels like it’s moving extremely quickly. It basically feels like it’s completely at capacity with these, and it’s sequenced these so it feels like it’s getting out the most important tax credits first.
  • Private industry feels like we need certainty. It’s almost a year since the law passed, and you haven’t gotten us the domestic content bonus. You haven’t gotten us the community solar bonus. You haven’t gotten us all these things yet.
  • a theme across the government right now is that the I.R.A. passed. Agencies have to write the regulations for all these tax credits. They feel like they’re moving very quickly, and yet companies feel like they’re not moving fast enough.
  • that’s how we get to this point where we’re 311 days out from the I.R.A. passing, and you’re like, well, has it made a big difference? And I’m like, well, frankly, wind and solar developers broadly don’t feel like they have the full understanding of all the subsidies they need yet to begin making the massive investments
  • I think it’s fair to say maybe the biggest bet on that is green hydrogen, if you’re looking in the bill.
  • We think it’s going to be an important tool in industry. It may be an important tool for storing energy in the power grid. It may be an important tool for anything that needs combustion.
  • ROBINSON MEYER: Yeah, absolutely. So green hydrogen — and let’s just actually talk about hydrogen broadly as this potential tool in the decarbonization tool kit.
  • It’s a molecule. It is a very light element, and you can burn it, but it’s not a fossil fuel. And a lot of the importance of hydrogen kind of comes back to that attribute of it.
  • So when we look at sectors of the economy that are going to be quite hard to decarbonize — and that’s because there is something about fossil fuels chemically that is essential to how that sector works either because they provide combustion heat and steelmaking or because fossil fuels are actually a chemical feedstock where the molecules in the fossil fuel are going into the product or because fossil fuels are so energy dense that you can carry a lot of energy while actually not carrying that much mass — any of those places, that’s where we look at hydrogen as going.
  • green hydrogen is something new, and the size of the bet is huge. So can you talk about first just what is green hydrogen? Because my understanding of it is spotty.
  • The I.R.A. is extremely generous — like extremely, extremely generous — in its hydrogen subsidies
  • The first is for what’s called blue hydrogen, which is hydrogen made from natural gas, where we then capture the carbon dioxide that was released from that process and pump it back into the ground. That’s one thing that’s subsidized. It’s basically subsidized as part of this broader set of packages targeted at carbon capture
  • green hydrogen, which is where we take water, use electrolyzers on it, basically zap it apart, take the hydrogen from the water, and then use that as a fuel
  • The I.R.A. subsidies for green hydrogen specifically, which is the one with water and electricity, are so generous that relatively immediately, it’s going to have a negative cost to make green hydrogen. It will cost less than $0 to make green hydrogen. The government’s going to fully cover the cost of producing it.
  • That is intentional because what needs to happen now is that green hydrogen moves into places where we’re using natural gas, other places in the industrial economy, and it needs to be price competitive with those things, with natural gas, for instance. And so as it kind of is transported, it’s going to cost money
  • As you make the investment to replace the technology, it’s going to cost money. And so as the hydrogen moves through the system, it’s going to wind up being price competitive with natural gas, but the subsidies in the bill are so generous that hydrogen will cost less than $0 to make a kilogram of it
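The “negative cost” claim above can be sketched with simple arithmetic. The I.R.A.’s 45V production tax credit tops out at $3.00 per kilogram of clean hydrogen; the production-cost figure below is a purely illustrative assumption (an electrolyzer running on cheap renewable power), not a quoted market price.

```python
# Sketch of the "negative cost" point: when the 45V credit exceeds the
# production cost, the effective cost per kilogram goes below zero.

CREDIT_45V_PER_KG = 3.00  # maximum IRA section 45V credit, $/kg of H2

def net_cost_per_kg(production_cost_per_kg: float) -> float:
    """Producer's effective cost per kg of green hydrogen after the credit."""
    return production_cost_per_kg - CREDIT_45V_PER_KG

# Hypothetical production cost with very cheap renewable electricity:
assumed_cost = 2.50  # $/kg, an illustrative assumption
print(net_cost_per_kg(assumed_cost))  # negative: credit fully covers the cost
```

Whether the net figure actually goes negative depends on the real production cost, which varies with electricity prices and electrolyzer efficiency; the point of the sketch is only the sign of the subtraction.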
  • There seems to be a sense that hydrogen, green hydrogen, is something we sort of know how to make, but we don’t know how to make it cost competitive yet. We don’t know how to infuse it into all the processes that we need to be infused into. And so a place where the I.R.A. is trying to create a reality that does not yet exist is a reality where green hydrogen is widely used, we have to know how to use it, et cetera.
  • And they just seem to think we don’t. And so you need all these factories. You need all this innovation. Like, they have to create a whole innovation and supply chain almost from scratch. Is that right?
  • ROBINSON MEYER: That’s exactly right. There’s a great Department of Energy report that I would actually recommend anyone interested in this read called “The Liftoff Report for Clean Hydrogen.” They made it for a few other technologies. It’s a hundred-page book that’s basically how the D.O.E. believes we’re going to build out a clean hydrogen economy.
  • And, of course, that is policy in its own right because the D.O.E. is saying, here is the years we’re going to invest to have certain infrastructure come online. Here’s what we think we need. That’s kind of a signal to industry that everyone should plan around those years as well.
  • It’s a great book. It’s like the best piece of industrial policy I’ve actually seen from the government at all. But one of the points it makes is that you’re going to make green hydrogen. You’re then going to need to move it. You’re going to need to move it in a pipeline or maybe a truck or maybe in storage tanks that you then cart around.
  • Once it gets to a facility that uses green hydrogen, you’re going to need to store some green hydrogen there in storage tanks on site because you basically need kind of a backup supply in case your main supply fails. All of those things are going to add cost to hydrogen. And not only are they going to add cost, we don’t really know how to do them. We have very few pipelines that are hydrogen ready.
  • All of that investment needs to happen as a result to make the green hydrogen economy come alive. And why it’s so lavishly subsidized is to kind of fund all that downstream investment that’s eventually going to make the economy come true.
  • But a lot of what has to happen here, including once the money is given out, is that things we do know how to build get built, and they get built really fast, and they get built at this crazy scale.
  • So I’ve been reading this paper on what they call “The Greens’ Dilemma” by J.B. Ruhl and James Salzman, who also wrote this paper called “Old Green Laws, New Green Deal,” or something like that. And I think they get at the scale problem here really well.
  • “The largest solar facility currently online in the US is capable of generating 585 megawatts. To meet even a middle-road renewable energy scenario would require bringing online two new 400-megawatt solar power facilities, each taking up at least 2,000 acres of land every week for the next 30 years.”
  • And that’s just solar. We’re not talking wind there. We’re not talking any of the other stuff we’ve discussed here, transmission lines. Can we do that? Do we have that capacity?
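The scale in the Ruhl and Salzman quote can be checked directly (assuming a 52-week year, and taking the 2,000-acre figure as a minimum):

```python
# Arithmetic behind the quoted claim: two new 400 MW solar facilities
# per week, each on at least 2,000 acres, every week for 30 years.

MW_PER_FACILITY = 400
FACILITIES_PER_WEEK = 2
ACRES_PER_FACILITY = 2_000
WEEKS_PER_YEAR = 52
YEARS = 30

total_facilities = FACILITIES_PER_WEEK * WEEKS_PER_YEAR * YEARS  # 3,120
total_gw = total_facilities * MW_PER_FACILITY / 1_000            # ~1,248 GW
total_acres = total_facilities * ACRES_PER_FACILITY              # 6.24M acres
total_sq_miles = total_acres / 640  # 640 acres per square mile  # ~9,750 sq mi
```

That is roughly the land area of Maryland devoted to new solar alone, which is the scale problem the paper is pointing at.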
  • ROBINSON MEYER: No, we do not. We absolutely do not. I think we’re going to build a ton of wind and solar. We do not right now have the system set up to use that much land to build that much new solar and wind by the time that we need to build it. I think it is partially because of permitting laws, and I think it’s also partially because right now there is no master plan
  • There’s no overarching strategic entity in the government that’s saying, how do we get from all these subsidies in the I.R.A. to net zero? What is our actual plan to get from where we are right now to where we’re emitting zero carbon as an economy? And without that function, no project is essential. No activity that we do absolutely needs to happen, and so therefore everything just kind of proceeds along at a convenient pace.
  • given the scale of what’s being attempted here, you might think that something the I.R.A. does is to have some entity in the government, as you’re saying, say, OK, we need this many solar farms. This is where we think we should put them. Let’s find some people to build them, or let’s build them ourselves.
  • what it actually does is there’s an office somewhere waiting for private companies to send in an application for a tax credit for solar that they say they’re going to build, and then we hope they build it
  • it’s an almost entirely passive process on the part of the government. Entirely would be going too far because I do think they talk to people, and they’re having conversations
  • the builder applies, not the government plans. Is that accurate?
  • ROBINSON MEYER: That’s correct. Yes.
  • ROBINSON MEYER: I think here’s what I would say, and this gets back to what do we want the I.R.A. to do and what are our expectations for the I.R.A
  • If the I.R.A. exists to build out a ton of green capacity and shift the political economy of the country toward being less dominated by fossil fuels and more dominated by the clean energy industry, frankly, then it is working
  • If the I.R.A. is meant to get us all the way to net zero, then it is not capable of that.
  • in 2022, right, we had no way to see how we were going to reduce emissions. We did not know if we were going to get a climate bill at all. Now, we have this really aggressive climate bill, and we’re like, oh, is this going to get us to net zero?
  • But getting to net zero was not even a possibility in 2022.
  • The issue is that the I.R.A. requires, ultimately, private actors to come forward and do these things. And as more and more renewables get onto the grid, almost mechanically, there’s going to be less interest in bringing the final pieces of decarbonized electricity infrastructure onto the grid as well.
  • EZRA KLEIN: Because the first things that get applied for are the ones that are more obviously profitable
  • The issue is when you talk to solar developers, they don’t see it like, “Am I going to make a ton of money, yes or no?” They see it like they have a capital stack, and they have certain incentives and certain ways to make money based off certain things they can do. And as more and more solar gets on the grid, building solar at all becomes less profitable
  • also, just generally, there’s less people willing to buy the solar.
  • as we get closer to a zero-carbon grid, there is this risk that basically less and less gets built because it will become less and less profitable
  • EZRA KLEIN: Let’s call that the last 20 percent risk
  • EZRA KLEIN: — or the last 40 percent. I mean, you can probably attach different numbers to that
  • ROBINSON MEYER: Permitting is the primary thing that is going to hold back any construction basically, especially out West,
  • right now permitting fights, the process under the National Environmental Policy Act just at the federal level, can take 4.5 years
  • let’s say every single project we need to do was applied for today, which is not true — those projects have not yet been applied for — they would be approved under the current permitting schedule in 2027.
  • ROBINSON MEYER: That’s before they get built.
  • Basically nobody on the left talked about permitting five years ago. I don’t want to say literally nobody, but you weren’t hearing it, including in the climate discussion.
  • people have moved to saying we do not have the laws, right, the permitting laws, the procurement laws to do this at the speed we’re promising, and we need to fix that. And then what you’re seeing them propose is kind of tweak oriented,
  • Permitting reform could mean a lot of different things, and Democrats and Republicans have different ideas about what it could mean. Environmental groups, within themselves, have different ideas about what it could mean.
  • for many environmental groups, the permitting process is their main tool. It is how they do the good that they see themselves doing in the world. They use the permitting process to slow down fossil fuel projects, to slow down projects that they see as harming local communities or the local environment.
  • ROBINSON MEYER: So we talk about the National Environmental Policy Act or NEPA. Let’s just start calling it NEPA. We talk about the NEPA process
  • NEPA requires the government basically study any environmental impact from a project or from a decision or from a big rule that could occur.
  • Any giant project in the United States goes through this NEPA process. The federal government studies what the environmental impact of the project will be. Then it makes a decision about whether to approve the project. That decision has nothing to do with the study. Now, notionally, the study is supposed to inform the project.
  • the decision the federal government makes, the actual “can you build this, yes or no,” legally has no connection to the study. But it must conduct the study in order to make that decision.
  • that permitting reform is so tough for the Democratic coalition specifically is that this process of forcing the government to amend its studies of the environmental impact of various decisions is the main tool that environmental litigation groups like Earthjustice use to slow down fossil fuel projects and use to slow down large-scale chemical or industrial projects that they don’t think should happen.
  • when we talk about making this program faster, and when we talk about making it more immune to litigation, they see it as we’re going to take away their main tools to fight fossil fuel infrastructure
  • why there’s this gap between rhetoric and what’s actually being proposed is that the same tool that is slowing down the green build-out is also what’s slowing down the fossil fuel build-out
  • ROBINSON MEYER: There’s the classic conflict here between the environmental movement classic, let’s call it — “think globally, act locally,” which said we’re going to do everything we can to preserve the local environment — and what the environmental movement and the climate movement need to do today, which is think globally and act with an eye to what we need globally as well: in some cases, welcoming projects that may slightly reduce local environmental quality, or may seem to, in the name of a decarbonized world.
  • Because if we fill the atmosphere with carbon, nobody’s going to get a good environment.
  • Michael Gerrard, who is professor at Columbia Law School. He’s a founder of the Sabin Center for Climate Change Law there. It’s called “A Time for Triage,” and he has this sort of interesting argument that the environmental movement in general, in his view, is engaged in something he calls trade-off denial.
  • his view and the view of some people is that, look, the climate crisis is so bad that we just have to make those choices. We have to do things we would not have wanted to do to preserve something like the climate in which not just human civilization, but this sort of animal ecosystem, has emerged. But that’s hard, and who gets to decide which trade-offs to make?
  • what you’re not really seeing — not really, I would say, from the administration, even though they have some principles now; not really from California, though Gavin Newsom has a set of early things — is “this is what we think we need to make the I.R.A. happen on time, and this is how we’re going to decide what is a kind of project that gets this speedway through,” w
  • there’s a failure on the part of, let’s say, the environmental coalition writ large to have the courage to have this conversation and to sit down at a table and be like, “OK, we know that certain projects aren’t happening fast enough. We know that we need to build out faster. What could we actually do to the laws to be able to construct things faster and to meet our net-zero targets and to let the I.R.A. kind achieve what it could achieve?”
  • part of the issue is that we’re in this environment where Democrats control the Senate, Republicans control the House, and it feels very unlikely that you could just get “we are going to accelerate projects, but only those that are good for climate change,” into the law given that Republicans control the House.
  • part of the progressive fear here is that the right solutions must recognize climate change. Progressives are very skeptical that there are reforms that are neutral on the existence of climate change and whether we need to build faster to meet those demands that can pass through a Republican-controlled House.
  • one of the implications of that piece was it was maybe a huge mistake for progressives not to have figured out what they wanted here and could accept here, back when the negotiating partner was Joe Manchin.
  • Manchin’s bill is basically a set of moderate NEPA reforms and transmission reforms. Democrats — progressives — refused to move on it. Now, I do want to be fair here because I think Democrats absolutely should have seized on that opportunity, because it was the only moment when — we could tell already that Democrats — I mean, Democrats actually, by that moment, had lost the House.
  • I do want to be fair here that Manchin’s own account of what happened with this bill is that Senate Republicans killed it and that once McConnell failed to negotiate on the bill in December, Manchin’s bill was dead.
  • EZRA KLEIN: It died in both places.
  • ROBINSON MEYER: It died in both places. I think that’s right.
  • Republicans already knew they were going to get the House, too, so they had less incentive to play along. Probably the time for this was October.
  • EZRA KLEIN: But it wasn’t like Democrats were trying to get this one done.
  • EZRA KLEIN: To your point about this was all coming down to the wire, Manchin could have let the I.R.A. pass many months before this, and they would have had more time to negotiate together, right? The fact that it was associated with Manchin in the way it was was also what made it toxic to progressives, who didn’t want to be held up by him anymore.
  • What becomes clear by the winter of this year, February, March of this year, is that as Democrats and Republicans begin to talk through this debt-ceiling process where, again, permitting was not the main focus. It was the federal budget. It was an entirely separate political process, basically.
  • EZRA KLEIN: I would say the core weirdness of the debt-ceiling fight was there was no main focus to it.
  • EZRA KLEIN: It wasn’t like past ones where it was about the debt. Republicans did some stuff to cut spending. They also wanted to cut spending on the I.R.S., which would increase the debt, right? It was a total mishmash of stuff happening in there.
  • That alchemy goes into the final debt-ceiling negotiations, which are between principals in Congress and the White House, and what we get is a set of basically the NEPA reforms in Joe Manchin’s bill from last year and the Mountain Valley pipeline, the thing that environmentalists were focused on blocking, and effectively no transmission reforms.
  • the set of NEPA reforms that were just enacted, that are now in the law, include — basically, the word reasonable has been inserted many times into NEPA. [LAUGHS] So the law, instead of saying the government has to study all environmental impacts, now it has to study reasonable environmental impacts.
  • this is a kind of climate win — has to study the environmental impacts that could result from not doing a project. The kind of average NEPA environmental impact study today is 500 pages and takes 4.5 years to produce. Under the law now, the government is supposed to hit a page limit of 150 to 300 pages.
  • there’s a study that’s very well cited by progressives from three professors in Utah who basically say, well, when you look at the National Forest Service, and you look at these 40,000 NEPA decisions, what mostly holds up these NEPA decisions is not like, oh, there’s too many requirements or they had to study too many things that don’t matter. It’s just there wasn’t enough staff and that staffing is primarily the big impediment. And so on the one hand, I think that’s probably accurate in that these are, in some cases — the beast has been starved, and these are very poorly staffed departments
  • The main progressive demand was just “we must staff it better.”
  • EZRA KLEIN: But if it’s taking you this much staffing and that much time to say something doesn’t apply to you, maybe you have a process problem —
  • ROBINSON MEYER: Yes.
  • EZRA KLEIN: — and you shouldn’t just throw endless resources at a broken process, which brings me — because, again, you can fall into this and never get out — I think, to the bigger critique here
  • these bills are almost symbolic because there’s so much else happening, and it’s really the way all this interlocks and the number of possible choke points, that if you touch one of them or even you streamline one of them, it doesn’t necessarily get you that far
  • “All told, over 60 federal permitting programs operate in the infrastructure approval regime, and that is just the federal system. State and local approvals and impact assessments could also apply to any project.”
  • their view is that under this system, it’s simply not possible to build the amount of decarbonization infrastructure we need at the pace we need it; that no amount of streamlining NEPA or streamlining, in California, CEQA will get you there; that we basically have been operating under what they call an environmental grand bargain dating back to the ’70s, where we built all of these processes to slow things down and to clean up the air and clean up the water.
  • we accepted this trade-off of slower building, quite a bit slower building, for a cleaner environment. And that was a good trade. It was addressing the problems of that era
  • now we have the problems of this era, which is we need to unbelievably, rapidly build out decarbonization infrastructure to keep the climate from warming more than we can handle and that we just don’t have a legal regime or anything.
  • You would need to do a whole new grand bargain for this era. And I’ve not seen that many people say that, but it seems true to me
  • the role that America had played in the global economy in the ’50s and ’60s where we had a ton of manufacturing, where we were kind of the factory to a world rebuilding from World War II, was no longer tenable and that, also, we wanted to focus on more of these kind of high-wage, what we would now call knowledge economy jobs.That was a large economic transition happening in the ’70s and ’80s, and it dovetailed really nicely with the environmental grand bargain.
  • At some point, the I.R.A. recognizes that that environmental grand bargain is no longer operative, right, because it says, we’re going to build all this big fiscal fixed infrastructure in the United States, we’re going to become a manufacturing giant again, but there has not been a recognition among either party of what exactly that will mean and what will be required to have it take hold.
  • It must require a form of on-the-ground, inside-the-fenceline, “at the site of the power plant” pollution control technology. The only way to do that, really, is by requiring carbon capture and requiring the large construction of major industrial infrastructure at many, many coal plants and natural gas plants around the country in order to capture carbon so it doesn’t enter the atmosphere, and so we don’t contribute to climate change. That is what the Supreme Court has ruled. Until that body changes, that is going to be the law.
  • So the E.P.A. has now, last month, proposed a new rule under the Clean Air Act that is going to require coal plants and some natural gas plants to install carbon capture technology to do basically what the Supreme Court has all but kind of required the E.P.A. to do
  • the E.P.A. has to demonstrate, in order to kind of make this rule the law and in order to make this rule pass muster with the Supreme Court, that this is tenable, that this is the best available and technologically feasible option
  • that means you actually have to allow carbon capture facilities to get built and you have to create a legal process that will allow carbon capture facilities to get built. And that means you need to be able to tell a power plant operator that if they capture carbon, there’s a way they can inject it back into the ground, the thing that they’re supposed to do with it.
  • Well, the E.P.A., simultaneously, is the agency that approves the kind of well that you need to inject carbon that you’ve captured from a coal plant or a natural gas plant back into the ground. It’s called a Class 6 well. The E.P.A. has only ever approved two Class 6 wells. It takes years for the E.P.A. to approve a Class 6 well.
  • And environmental justice groups really, really oppose these Class 6 wells because they see any carbon capture as an effort to extend the life of the fossil fuel infrastructure
  • The issue here is that it seems like C.C.S., carbon capture, is going to be essential to how the U.S. decarbonizes. Legally, we have no other choice because of the constraints the Supreme Court has placed on the E.P.A.. At the same time, environmental justice groups, and big green groups to some extent, oppose building out any C.C.S.
  • to be fair to them, right, they would say there are other ways to decarbonize. That may not be the way we’ve chosen because the politics weren’t there for it, but there are a lot of these groups that believe you could have 100 percent renewables, do not use all that much carbon capture, right? They would have liked to see a different decarbonization path taken too. I’m not sure that path is realistic.
  • what you do see are environmental groups opposing making it possible to build C.C.S. anywhere in the country at all.
  • EZRA KLEIN: The only point I’m making here is I think this is where you see a compromise a lot of them didn’t want to make —
  • ROBINSON MEYER: Exactly, yeah.
  • EZRA KLEIN: — which is a decarbonization strategy that actually does extend the life cycle of a lot of fossil fuel infrastructure using carbon capture. And because they never bought onto it, they’re still using the pathway they have to try to block it. The problem is that’s part of the path that’s now been chosen. So if you block it, you just don’t decarbonize. It’s not like you get the 100 percent renewable strategy.
  • ROBINSON MEYER: Exactly. The bargain that will emerge from that set of actions and that set of coalitional trade-offs is we will simply keep running this, and we will not cap it.
  • What could be possible is that progressives and Democrats and the E.P.A. turns around and says, “Oh, that’s fine. You can do C.C.S. You just have to cap every single stationary source in the country.” Like, “You want to do C.C.S.? We totally agree. Essential. You must put C.C.S. infrastructure on every power plant, on every factory that burns fossil fuels, on everything.”
  • If progressives were to do that and were to get it into the law — and there’s nothing the Supreme Court has said, by the way, that would limit progressives from doing that — the upshot would be we shut down a ton more stationary sources and a ton more petrochemical refineries and these bad facilities that groups don’t want than we would under the current plan.
  • what is effectively going to happen is that way more factories and power plants stay open and uncapped than would be otherwise.
  • EZRA KLEIN: So Republican-controlled states are just on track to get a lot more of it. So the Rocky Mountain Institute estimates that red states will get $623 billion in investments by 2030 compared to $354 billion for blue states.
  • why are red states getting so much more of this money?
  • ROBINSON MEYER: I think there’s two reasons. I think, first of all, red states have been more enthusiastic about getting the money. They’re the ones giving away the tax credits. They have a business-friendly environment. And ultimately, the way many, many of these red-state governors see it is that these are just businesses.
  • I think the other thing is that these states, many of them, are right-to-work states. And so they might pay their workers less. They certainly face much less risk financially from a unionization campaign in their state.
  • regardless of the I.R.A., that’s where manufacturing and industrial investment goes in the first place. And that’s where it’s been going for 20 years because of the set of business-friendly and local subsidies and right-to-work policies.
  • I think the administration would say, we want this to be a big union-led effort. We want it to go to the Great Lakes states that are our political firewall.
  • and it would go to red states, because that’s where private industry has been locating since the ’70s and ’80s, and it would go to the Southeast, right, and the Sunbelt, and that that wouldn’t be so bad because then you would get a dynamic where red-state senators, red-state representatives, red-state governors would want to support the transition further and would certainly not support the repeal of the I.R.A. provisions and the repeal of climate provisions, and that you’d get this kind of nice vortex of the investment goes to red states, red states feel less antagonistic toward climate policies, more investment goes to red states. Red-state governors might even begin to support environmental regulation because that basically locks in benefits and advantages to the companies located in their states already.
  • I think what you see is that Republicans are increasingly warming to EV investment, and it’s actually building out renewables and actually building out clean electricity generation, where you see them fighting harder.
  • The other way that permitting matters — and this gets into the broader reason why private investment was generally going to red states and generally going to the Sunbelt — is that the Sunbelt states — Georgia, Texas — it’s easier to be there as a company because housing costs are lower and because the cost of living is lower in those states.
  • it’s also partially because the Sunbelt and the Southeast, it was like the last part of the country to develop, frankly, and there’s just a ton more land around all the cities, and so you can get away with the sprawling suburban growth model in those cities
  • It’s just cheaper to keep building suburbs there.
  • EZRA KLEIN: So how are you seeing the fights over these rare-earth metals and the effort to build a safe and, if not domestic, kind of friend-shored supply chain there?
  • Are we going to be able to source some of these minerals from the U.S.? That process seems to be proceeding but going slowly. There are some minerals we’re not going to be able to get from the United States at all and are going to have to get from our allies and partners across the world.
  • The kind of open question there is what exactly is the bargain we’re going to strike with countries that have these critical minerals, and will it be fair to those countries?
  • it isn’t to say that I think the I.R.A. on net is going to be bad for other countries. I just think we haven’t really figured out what deal and even what mechanisms we can use across the government to strike deals with other countries to mine the minerals in those countries while being fair and just and creating the kind of economic arrangement that those countries want.
  • Let’s say we get the minerals. Let’s say we learn how to refine them. There is many parts of the battery and many parts of EVs and many, many subcomponents in these green systems that there’s not as strong incentive to produce in the U.S.
  • at the same time, there’s a ton of technology. One answer to that might be to say, OK, well, what the federal government should do is just make it illegal for any of these battery makers or any of these EV companies to work with Chinese companies, so then we’ll definitely establish this parallel supply chain. We’ll learn how to make cathodes and anodes. We’ll figure it out
  • The issue is that there’s technology on the frontier that only Chinese companies have, and U.S. automakers need to work with those companies in order to be able to compete with them eventually.
  • EZRA KLEIN: How much easier would it be to achieve the I.R.A.’s goals if America’s relationship with China was more like its relationship with Germany?
  • ROBINSON MEYER: It would be significantly easier, and I think we’d view this entire challenge very differently, because China, as you said, not only is a leader in renewable energy. It actually made a lot of the important technological gains over the past 15 years to reducing the cost of solar and wind. It really did play a huge role on the supply side of reducing the cost of these technologies.
  • If we could approach that, if China were like Germany, if China were like Japan, and we could say, “Oh, this is great. China’s just going to make all these things. Our friend, China, is just going to make all these technologies, and we’re going to import them.”
  • So it refines 75 percent of the polysilicon that you need for solar, but the machines that do the refining, 99 percent of them are made in China. I think it would be reckless for the U.S. to kind of rely on a single country and for the world to rely on a single country to produce the technologies that we need for decarbonization and unwise, regardless of our relationship with that country.
  • We want to geographically diversify the supply chain more, but it would be significantly easier if we did not have to also factor into this the possibility that the U.S. is going to need an entirely separate supply chain to draw on for EVs, solar panels, wind turbines, and batteries, potentially in the near-term future.
  • What are three other books they should read?
  • The first book is called “The Ends of the World” by Peter Brannen. It’s a history of the Earth’s five mass extinctions, and of why he doesn’t think we’re currently in a mass extinction, or why, at least, things would need to stay just as bad as they are right now for thousands and thousands of years for us to be in, basically, a sixth extinction.
  • The book’s amazing for two reasons. The first is that it really got me to understand deep time.
  • He explains how one extinction kind of triggered the next one. It is also an amazing book for understanding the centrality of carbon to Earth’s geological history, going as far back as, basically, we can track.
  • The second book is “Climate Shock” by Gernot Wagner and Marty Weitzman. It’s about the economics of climate change.
  • Marty Weitzman, I think, was until recently kind of the also-ran important economist of climate change. Nordhaus was the famous economist. He was the one who got all the attention. He’s the one who won the Nobel.
  • He focuses on risk: climate change is bad because it will damage the environment, because it will make our lives worse, but it’s specifically bad because we don’t know how bad it will be.
  • It imposes all these huge, high-end tail risks, and blocking those tail risks is actually the main thing we want to do with climate policy.
  • That is, I think, in some ways what has become the U.S. approach to climate change and, to some degree, the underlying economic thinking that drives even the I.R.A., where we want to just cut off these high-end mega-warming scenarios. And this is a fantastic explanation of that particular way of thinking, and of how to apply it to climate change and also to geoengineering.
  • The third book, a little controversial, is called “Shorting the Grid” by Meredith Angwin.
  • Her argument is basically that electricity markets are not the right structure to organize our electricity system, and that because we have chosen markets as the structure to organize the electricity system in many states, we’re giving preferential treatment to natural gas and renewables, two fuels that climate activists may feel very differently about, instead of coal, which she does think we should phase out, and, really, nuclear.
  • By making it easier for renewables and natural gas to kind of accept these side payments, we made them much more profitable and therefore encouraged people to build more of them and therefore underinvested in the forms of generation, such as nuclear, that actually make most of their money by selling electrons to the grid, where they go to people’s homes.
Javier E

Is Holocaust Education Making Anti-Semitism Worse? - The Atlantic - 0 views

  • The recent rise in American anti-Semitism is well documented. I could fill pages with FBI hate-crime statistics, or with a list of violent attacks from the past six years or even the past six months, or with the growing gallery of American public figures saying vile things about Jews. Or I could share stories you probably haven’t heard, such as one about a threatened attack on a Jewish school in Ohio in March 2022—where the would-be perpetrator was the school’s own security guard. But none of that would capture the vague sense of dread one encounters these days in the Jewish community, a dread unprecedented in my lifetime.
  • What I didn’t expect was the torrent of private stories I received from American Jews.
  • ...137 more annotations...
  • well-meaning people everywhere from statehouses to your local middle school have responded to this surging anti-Semitism by doubling down on Holocaust education. Before 2016, only seven states required Holocaust education in schools. In the past seven years, 18 more have passed Holocaust-education mandates
  • These casual stories sickened me in their volume and their similarity, a catalog of small degradations. At a time when many people in other minority groups have become bold in publicizing the tiniest of slights, these American Jews instead expressed deep shame in sharing these stories with me, feeling that they had no right to complain. After all, as many of them told me, it wasn’t the Holocaust.
  • These people talked about bosses and colleagues who repeatedly ridiculed them with anti-Semitic “jokes,” friends who turned on them when they mentioned a son’s bar mitzvah or a trip to Israel, romantic partners who openly mocked their traditions, classmates who defaced their dorm rooms and pilloried them online, teachers and neighbors who parroted conspiratorial lies. I was surprised to learn how many people were getting pennies thrown at them in 21st-century America.
  • the blood libel, which would later be repurposed as a key part of the QAnon conspiracy theory. This craze wasn’t caused by one-party control over printing presses, but by the lie’s popularity
  • I have come to the disturbing conclusion that Holocaust education is incapable of addressing contemporary anti-Semitism. In fact, in the total absence of any education about Jews alive today, teaching about the Holocaust might even be making anti-Semitism worse.
  • The Illinois Holocaust Museum & Education Center is a victim of its own success. When I arrived on a weekday morning to join a field trip from a local Catholic middle school, the museum was having a light day, with only 160 students visiting
  • the docent established that the ’30s featured media beyond town criers, and that one-party control over such media helped spread propaganda. “If radio’s controlled by a certain party, you have to question that,” she said. “Back then, they didn’t.”
  • I wondered about that premise. Historians have pointed out that it doesn’t make sense to assume that people in previous eras were simply stupider than we are, and I doubted that 2020s Americans could outsmart 1930s Germans in detecting media bias. Propaganda has been used to incite violent anti-Semitism since ancient times, and only rarely because of one-party control.
  • The Nazi project was about murdering Jews, but also about erasing Jewish civilization. The museum’s valiant effort to teach students that Jews were “just like everyone else,” after Jews have spent 3,000 years deliberately not being like everyone else, felt like another erasure.
  • I was starting to see how isolating the Holocaust from the rest of Jewish history made it hard for even the best educators to upload this irrational reality into seventh-grade brains.
  • The docent began by saying, “Let’s establish facts. Is Judaism a religion or a nationality?”
  • My stomach sank. The question betrayed a fundamental misunderstanding of Jewish identity—Jews predate the concepts of both religion and nationality. Jews are members of a type of social group that was common in the ancient Near East but is uncommon in the West today: a joinable tribal group with a shared history, homeland, and culture, of which a nonuniversalizing religion is but one feature
  • Millions of Jews identify as secular, which would be illogical if Judaism were merely a religion. But every non-Jewish society has tried to force Jews into whatever identity boxes it knows best—which is itself a quiet act of domination.
  • “Religion, right,” the docent affirmed. (Later, in the gallery about Kristallnacht, she pointed out how Jews had been persecuted for having the “wrong religion,” which would have surprised the many Jewish converts to Christianity who wound up murdered. I know the docent knew this; she later told me she had abbreviated things to hustle our group to the museum’s boxcar.)
  • The docent motioned toward the prewar gallery’s photos showing Jewish school groups and family outings, and asked how the students would describe their subjects’ lives, based on the pictures. “Normal,” a girl said. “Normal, perfect,” the docent said. “They paid taxes, they fought in the wars—all of a sudden, things changed.”
  • the museum had made a conscious decision not to focus on the long history of anti-Semitism that preceded the Holocaust, and made it possible. To be fair, adequately covering this topic would have required an additional museum
  • The bedrock assumption that has endured for nearly half a century is that learning about the Holocaust inoculates people against anti-Semitism. But it doesn’t.
  • Then there was the word normal. More than 80 percent of Jewish Holocaust victims spoke Yiddish, a 1,000-year-old European Jewish language spoken around the world, with its own schools, books, newspapers, theaters, political organizations, advertising, and film industry. On a continent where language was tightly tied to territory, this was hardly “normal.” Traditional Jewish practices—which include extremely detailed rules governing food and clothing and 100 gratitude blessings recited each day—were not “normal” either.
  • the idea of sudden change—referring to not merely the Nazi takeover, but the shift from a welcoming society to an unwelcoming one—was also reinforced by survivors in videos around the museum
  • Teaching children that one shouldn’t hate Jews, because Jews are “normal,” only underlines the problem: If someone doesn’t meet your version of “normal,” then it’s fine to hate them.
  • When I asked about worst practices in Holocaust education, Szany had many to share, which turned out to be widely agreed-upon among American Holocaust educators.
  • First on the list: “simulations.” Apparently some teachers need to be told not to make students role-play Nazis versus Jews in class, or not to put masking tape on the floor in the exact dimensions of a boxcar in order to cram 200 students into it.
  • Szany also condemned Holocaust fiction such as the international best seller The Boy in the Striped Pajamas, an exceedingly popular work of ahistorical Christian-savior schlock
  • She didn’t feel that Anne Frank’s diary was a good choice either, because it’s “not a story of the Holocaust”—it offers little information about most Jews’ experiences of persecution, and ends before the author’s capture and murder.
  • Other officially failed techniques include showing students gruesome images, and prompting self-flattery by asking “What would you have done?”
  • Yet another bad idea is counting objects. This was the conceit of a widely viewed 2004 documentary called Paper Clips, in which non-Jewish Tennessee schoolchildren, struggling to grasp the magnitude of 6 million murdered Jews, represented those Jews by collecting millions of paper clips
  • it is demeaning to represent Jewish people as office supplies.
  • Best practices, Szany explained, are the opposite: focusing on individual stories, hearing from survivors and victims in their own words. The Illinois museum tries to “rescue the individuals from the violence,
  • In the language I often encountered in Holocaust-education resources, people who lived through the Holocaust were neatly categorized as “perpetrators,” “victims,” “bystanders,” or “upstanders.” Jewish resisters, though, were rarely classified as “upstanders.”
  • I felt as I often had with actual Holocaust survivors I’d known when I was younger: frustrated as they answered questions I hadn’t asked, and vaguely insulted as they treated me like an annoyance to be managed. (I bridged this divide once I learned Yiddish in my 20s, and came to share with them a vast vocabulary of not only words, but people, places, stories, ideas—a way of thinking and being that contained not a few horrific years but centuries of hard-won vitality and resilience.)
  • Szany at last explained to me what the dead Elster couldn’t: The woman who sheltered his sister took only girls because it was too easy for people to confirm that the boys were Jews.
  • I realized that I wouldn’t have wanted to hear this answer from Elster. I did not want to make this thoughtful man sit onstage and discuss his own circumcision with an audience of non-Jewish teenagers. The idea felt just as dehumanizing as pulling down a boy’s pants to reveal a reality of embodied Judaism that, both here and in that barn, had been drained of any meaning beyond persecution
  • Here I am in a boxcar, I thought, and tried to make it feel real. I spun my head to take in the immersive scene, which swung around me as though I were on a rocking ship. I felt dizzy and disoriented, purely physical feelings that distracted me. Did this not count as a simulation?
  • I had visited Auschwitz in actual reality, years ago. With my headset on, I tried to summon the emotional intensity I remembered feeling then. But I couldn’t, because all of the things that had made it powerful were missing. When I was there, I was touching things, smelling things, sifting soil between my fingers that the guide said contained human bone ash, feeling comforted as I recited the mourner’s prayer, the kaddish, with others, the ancient words an undertow of paradox and praise: May the great Name be blessed, forever and ever and ever
  • Students at the Skokie museum can visit an area called the Take a Stand Center, which opens with a bright display of modern and contemporary “upstanders,” including activists such as the Nobel laureate Malala Yousafzai and the athlete Carli Lloyd. Szany had told me that educators “wanted more resources” to connect “the history of the Holocaust to lessons of today.” (I heard this again and again elsewhere too.) As far as I could discern, almost nobody in this gallery was Jewish.
  • As Szany ran a private demo of the technology for me, I asked how visitors react to it. “They’re more comfortable with the holograms than the real survivors,” Szany said. “Because they know they won’t be judged.”
  • But the post-Holocaust activists featured in this gallery were nearly all people who had stood up for their own group. Only Jews, the unspoken assumption went, were not supposed to stand up for themselves.
  • Visitors were asked to “take the pledge” by posting notes on a wall (“I pledge to protect the Earth!” “I pledge to be KIND!”)
  • It was all so earnest that for the first time since entering the museum, I felt something like hope. Then I noticed it: “Steps for Organizing a Demonstration.” The Nazis in Skokie, like their predecessors, had known how to organize a demonstration. They hadn’t been afraid to be unpopular. They’d taken a stand.
  • I left the museum haunted by the uncomfortable truth that the structures of a democratic society could not really prevent, and could even empower, dangerous, irrational rage. Something of that rage haunted me too.
  • the more I thought about it, the less obvious it seemed. What were students being taught to “take a stand” for? How could anyone, especially young people with little sense of proportion, connect the murder of 6 million Jews to today without landing in a swamp of Holocaust trivialization, like the COVID-protocol protesters who’d pinned Jewish stars to their shirt and carried posters of Anne Frank?
  • weren’t they and others like them doing exactly what Holocaust educators claimed they wanted people to do?
  • The 2019 law was inspired by a changing reality in Washington and around the country. In recent years, Kennedy said, she’s received more and more messages about anti-Semitic vandalism and harassment in schools. For example, she told me, “someone calls and says, ‘There’s a swastika drawn in the bathroom.’ ”
  • Maybe not, Kennedy admitted. “What frightens me is that small acts of anti-Semitism are becoming very normalized,” she said. “We’re getting used to it. That keeps me up at night.” “Sadly, I don’t think we can fix this,” Regelbrugge said. “But we’re gonna die trying.”
  • Almost every city where I spoke with Holocaust-museum educators, whether by phone or in person, had also been the site of a violent anti-Semitic attack in the years since these museums had opened
  • I was struck by how minimally these attacks were discussed in the educational materials shared by the museums.
  • In fact, with the exception of Kennedy and Regelbrugge, no one I spoke with mentioned these anti-Semitic attacks at all.
  • The failure to address contemporary anti-Semitism in most of American Holocaust education is, in a sense, by design
  • the story of the (mostly non-Jewish) teachers in Massachusetts and New Jersey who created the country’s first Holocaust curricula, in the ’70s. The point was to teach morality in a secular society. “Everyone in education, regardless of ethnicity, could agree that Nazism was evil and that the Jews were innocent victims,” Fallace wrote, explaining the topic’s appeal. “Thus, teachers used the Holocaust to activate the moral reasoning of their students”—to teach them to be good people.
  • The idea that Holocaust education can somehow serve as a stand-in for public moral education has not left us. And because of its obviously laudable goals, objecting to it feels like clubbing a baby seal. Who wouldn’t want to teach kids to be empathetic?
  • by this logic, shouldn’t Holocaust education, because of its moral content alone, automatically inoculate people against anti-Semitism?
  • Apparently not. “Essentially the moral lessons that the Holocaust is often used to teach reflect much the same values that were being taught in schools before the Holocaust,”
  • (Germans in the ’30s, after all, were familiar with the Torah’s commandment, repeated in the Christian Bible, to love their neighbors.) This fact undermines nearly everything Holocaust education is trying to accomplish, and reveals the roots of its failure.
  • One problem with using the Holocaust as a morality play is exactly its appeal: It flatters everyone. We can all congratulate ourselves for not committing mass murder.
  • This approach excuses current anti-Semitism by defining anti-Semitism as genocide in the past
  • When anti-Semitism is reduced to the Holocaust, anything short of murdering 6 million Jews—like, say, ramming somebody with a shopping cart, or taunting kids at school, or shooting up a Jewish nonprofit, or hounding Jews out of entire countries—seems minor by comparison.
  • If we teach that the Holocaust happened because people weren’t nice enough—that they failed to appreciate that humans are all the same, for instance, or to build a just society—we create the self-congratulatory space where anti-Semitism grow
  • One can believe that humans are all the same while being virulently anti-Semitic, because according to anti-Semites, Jews, with their millennia-old insistence on being different from their neighbors, are the obstacle to humans all being the same
  • One can believe in creating a just society while being virulently anti-Semitic, because according to anti-Semites, Jews, with their imagined power and privilege, are the obstacle to a just society
  • To inoculate people against the myth that humans have to erase their differences in order to get along, and the related myth that Jews, because they have refused to erase their differences, are supervillains, one would have to acknowledge that these myths exist
  • To really shatter them, one would have to actually explain the content of Jewish identity, instead of lazily claiming that Jews are just like everyone else.
  • Goss, who works for one of several major Holocaust-curriculum providers, told me about the “terrible Jew jokes” she’d heard from her own students in Virginia. “They don’t necessarily know where they come from or even really why they’re saying them,” Goss said. “Many kids understand not to say the N-word, but they would say, ‘Don’t be such a Jew.’ ”
  • “There’s a decline in history education at the same time that there’s a rise in social media,”
  • “We’ve done studies with our partners at Holocaust centers that show that students are coming in with questions about whether the Holocaust was an actual event. That wasn’t true 20 years ago.”
  • Goss believes that one of the reasons for the lack of stigma around anti-Semitic conspiracy theories and jokes is baked into the universal-morality approach to Holocaust education. “The Holocaust is not a good way to teach about ‘bullying.’ ”
  • Echoes & Reflections’ lesson plans do address newer versions of anti-Semitism, including the contemporary demonization of Israel’s existence—as opposed to criticism of Israeli policies—and its manifestation in aggression against Jews. Other Holocaust-curriculum providers also have material on contemporary anti-Semitism.
  • providers rarely explain or explore who Jews are today—and their raison d’être remains Holocaust education.
  • Many teachers had told me that their classrooms “come alive” when they teach about the Holocaust
  • Holocaust-education materials are just plain better than those on most other historical topics. All of the major Holocaust-education providers offer lessons that teachers can easily adapt for different grade levels and subject areas. Instead of lecturing and memorization, they use participation-based methods such as group work, hands-on activities, and “learner driven” projects.
  • A 2019 Pew Research Center survey found a correlation between “warm” feelings about Jews and knowledge about the Holocaust—but the respondents who said they knew a Jewish person also tended to be more knowledgeable about the Holocaust, providing a more obvious source for their feelings
  • In 2020, Echoes & Reflections published a commissioned study of 1,500 college students, comparing students who had been exposed to Holocaust education in high school with those who hadn’t. The published summary shows that those who had studied the Holocaust were more likely to tolerate diverse viewpoints, and more likely to privately support victims of bullying scenarios, which is undoubtedly good news. It did not, however, show a significant difference in respondents’ willingness to defend victims publicly, and students who’d received Holocaust education were less likely to be civically engaged—in other words, to be an “upstander.”
  • These studies puzzled me. As Goss told me, the Holocaust was not about bullying—so why was the Echoes study measuring that? More important, why were none of these studies examining awareness of anti-Semitism, whether past or present?
  • One major study addressing this topic was conducted in England, where a national Holocaust-education mandate has been in place for more than 20 years. In 2016, researchers at University College London’s Centre for Holocaust Education published a survey of more than 8,000 English secondary-school students, including 244 whom they interviewed at length.
  • The study’s most disturbing finding was that even among those who studied the Holocaust, there was “a very common struggle among many students to credibly explain why Jews were targeted” in the Holocaust—that is, to cite anti-Semitism
  • “many students appeared to regard [Jews’] existence as problematic and a key cause of Nazi victimisation.” In other words, students blamed the Holocaust on the Jews
  • This result resembles that of a large 2020 survey of American Millennials and Gen Zers, in which 11 percent of respondents believed that Jews caused the Holocaust. The state with the highest percentage of respondents believing this—an eye-popping 19 percent—was New York, which has mandated Holocaust education since the 1990s.
  • Worse, in the English study, “a significant number of students appeared to tacitly accept some of the egregious claims once circulated by Nazi propaganda,” instead of recognizing them as anti-Semitic myths.
  • One typical student told researchers, “Is it because like they were kind of rich, so maybe they thought that that was kind of in some way evil, like the money didn’t belong to them[;] it belonged to the Germans and the Jewish people had kind of taken that away from them?
  • Another was even more blunt: “The Germans, when they saw the Jews were better off than them, kind of, I don’t know, it kind of pissed them off a bit.” Hitler’s speeches were more eloquent in making similar points.
  • One of the teachers I met was Benjamin Vollmer, a veteran conference participant who has spent years building his school’s Holocaust-education program. He teaches eighth-grade English in Venus, Texas, a rural community with 5,700 residents; his school is majority Hispanic, and most students qualify for free or reduced-price lunch. When I asked him why he focuses on the Holocaust, his initial answer was simple: “It meets the TEKS.”
  • The TEKS are the Texas Essential Knowledge and Skills, an elaborate list of state educational requirements that drive standardized testing
  • it became apparent that Holocaust education was something much bigger for his students: a rare access point to a wider world. Venus is about 30 miles from Dallas, but Vollmer’s annual Holocaust-museum field trip is the first time that many of his students ever leave their town.
  • “It’s become part of the school culture,” Vollmer said. “In eighth grade, they walk in, and the first thing they ask is, ‘When are we going to learn about the Holocaust?’ ”
  • Vollmer is not Jewish—and, as is common for Holocaust educators, he has never had a Jewish student. (Jews are 2.4 percent of the U.S. adult population, according to a 2020 Pew survey.) Why not focus on something more relevant to his students, I asked him, like the history of immigration or the civil-rights movement?
  • I hadn’t yet appreciated that the absence of Jews was precisely the appeal.“Some topics have been so politicized that it’s too hard to teach them,” Vollmer told me. “Making it more historical takes away some of the barriers to talking about it.”
  • Wouldn’t the civil-rights movement, I asked, be just as historical for his students? He paused, thinking it through. “You have to build a level of rapport in your class before you have the trust to explore your own history,” he finally said.
  • “The Holocaust happened long ago, and we’re not responsible for it,” she said. “Anything happening in our world today, the wool comes down over our eyes.” Her colleague attending the conference with her, a high-school teacher who also wouldn’t share her name, had tried to take her mostly Hispanic students to a virtual-reality experience called Carne y Arena, which follows migrants attempting to illegally cross the U.S.-Mexico border. Her administrators refused, claiming that it would traumatize students. But they still learn about the Holocaust.
  • Student discomfort has been a legal issue in Texas. The state’s House Bill 3979, passed in 2021, is one of many “anti-critical-race-theory” laws that conservative state legislators have introduced since 2020. The bill forbade teachers from causing students “discomfort, guilt, anguish, or any other form of psychological distress on account of the individual’s race or sex,” and also demanded that teachers introduce “diverse and contending perspectives” when teaching “controversial” topics, “without giving deference to any one perspective.
  • These vaguely worded laws stand awkwardly beside a 2019 state law mandating Holocaust education for Texas students at all grade levels during an annual Holocaust Remembrance Week
  • the administrator who’d made the viral remarks in Southlake is a strong proponent of Holocaust education, but was acknowledging a reality in that school district. Every year, the administrator had told Higgins, some parents in her district object to their children reading the Nobel laureate Elie Wiesel’s memoir Night—because it isn’t their “belief” that the Holocaust happened.
  • In one model lesson at the conference, participants examined a speech by the Nazi official Heinrich Himmler about the need to murder Jews, alongside a speech by the Hebrew poet and ghetto fighter Abba Kovner encouraging a ghetto uprising. I only later realized that this lesson plan quite elegantly satisfied the House bill’s requirement of providing “contending perspectives.”
  • The next day, I asked the instructor if that was an unspoken goal of her lesson plan. With visible hesitation, she said that teaching in Texas can be like “walking the tightrope.” This way, she added, “you’re basing your perspectives on primary texts and not debating with Holocaust deniers.” Less than an hour later, a senior museum employee pulled me aside to tell me that I wasn’t allowed to interview the staff.
  • Many of the visiting educators at the conference declined to talk with me, even anonymously; nearly all who did spoke guardedly. The teachers I met, most of whom were white Christian women, did not seem to be of any uniform political bent. But virtually all of them were frustrated by what administrators and parents were demanding of them.
  • Two local middle-school teachers told me that many parents insist on seeing reading lists. Parents “wanting to keep their kid in a bubble,” one of them said, has been “the huge stumbling block.”
  • “It is healthy to begin this study by talking about anti-Semitism, humanizing the victims, sticking to primary sources, and remaining as neutral as possible.”
  • Wasn’t “remaining as neutral as possible” exactly the opposite of being an upstander?
  • In trying to remain neutral, some teachers seemed to want to seek out the Holocaust’s bright side—and ask dead Jews about it.
  • We watched a brief introduction about Glauben’s childhood and early adolescence in the Warsaw Ghetto and in numerous camps. When the dead man appeared, one teacher asked, “Was there any joy or happiness in this ordeal? Moments of joy in the camps?”
  • These experiences, hardly unusual for Jewish victims, were not the work of a faceless killing machine. Instead they reveal a gleeful and imaginative sadism. For perpetrators, this was fun. Asking this dead man about “joy” seemed like a fundamental misunderstanding of the Holocaust. There was plenty of joy, just on the Nazi side.
  • In the educational resources I explored, I did not encounter any discussions of sadism—the joy derived from humiliating people, the dopamine hit from landing a laugh at someone else’s expense, the self-righteous high from blaming one’s problems on others—even though this, rather than the fragility of democracy or the passivity of bystanders, is a major origin point of all anti-Semitism
  • To anyone who has spent 10 seconds online, that sadism is familiar, and its source is familiar too: the fear of being small, and the desire to feel big by making others feel small instead.
  • Nazis were, among other things, edgelords, in it for the laughs. So, for that matter, were the rest of history’s anti-Semites, then and now. For Americans today, isn’t this the most relevant insight of all?
  • “People say we’ve learned from the Holocaust. No, we didn’t learn a damn thing,”
  • “People glom on to this idea of the upstander,” she said. “Kids walk away with the sense that there were a lot of upstanders, and they think, Yes, I can do it too.”
  • The problem with presenting the less inspiring reality, she suggested, is how parents or administrators might react. “If you teach historical anti-Semitism, you have to teach contemporary anti-Semitism. A lot of teachers are fearful, because if you try to connect it to today, parents are going to call, or administrators are going to call, and say you’re pushing an agenda.”
  • But weren’t teachers supposed to “push an agenda” to stop hatred? Wasn’t that the entire hope of those survivors who built museums and lobbied for mandates and turned themselves into holograms?
  • I asked Klett why no one seemed to be teaching anything about Jewish culture. If the whole point of Holocaust education is to “humanize” those who were “dehumanized,” why do most teachers introduce students to Jews only when Jews are headed for a mass grave? “There’s a real fear of teaching about Judaism,” she confided. “Especially if the teacher is Jewish.”
  • Teachers who taught about industrialized mass murder were scared of teaching about … Judaism? Why?
  • “Because the teachers are afraid that the parents are going to say that they’re pushing their religion on the kids.”
  • “Survivors have told me, ‘Thank you for teaching this. They’ll listen to you because you’re not Jewish,’ ” she said. “Which is weird.”
  • perhaps we could be honest and just say “There is no point in teaching any of this”—because anti-Semitism is so ingrained in our world that even when discussing the murders of 6 million Jews, it would be “pushing an agenda” to tell people not to hate them, or to tell anyone what it actually means to be Jewish
  • The Dallas Museum was the only one I visited that opened with an explanation of who Jews are. Its exhibition began with brief videos about Abraham and Moses—limiting Jewish identity to a “religion” familiar to non-Jews, but it was better than nothing. The museum also debunked the false charge that the Jews—rather than the Romans—killed Jesus, and explained the Jews’ refusal to convert to other faiths. It even had a panel or two about contemporary Dallas Jewish life. Even so, a docent there told me that one question students ask is “Are any Jews still alive today?”
  • American Holocaust education, in this museum and nearly everywhere else, never ends with Jews alive today. Instead it ends by segueing to other genocides, or to other minorities’ suffering
  • But when one reaches the end of the exhibition on American slavery at the National Museum of African American History and Culture, in Washington, D.C., one does not then enter an exhibition highlighting the enslavement of other groups throughout world history, or a room full of interactive touchscreens about human trafficking today, asking that visitors become “upstanders” in fighting it.
  • That approach would be an insult to Black history, ignoring Black people’s current experiences while turning their past oppression into nothing but a symbol for something else, something that actually matters.
  • It is dehumanizing to be treated as a symbol. It is even more dehumanizing to be treated as a warning.
  • How should we teach children about anti-Semitism?
  • Decoster began her conference workshop by introducing “vocabulary must-knows.” At the top of her list: anti-Semitism.
  • “If you don’t explain the ism,” she cautioned the teachers in the room, “you will need to explain to the kids ‘Why the Jews?’ Students are going to see Nazis as aliens who bring with them anti-Semitism when they come to power in ’33, and they take it back away at the end of the Holocaust in 1945.”
  • She asked the teachers, “What’s the first example of the persecution of the Jews in history?”
  • “Think ancient Egypt,” Decoster said. “Does this sound familiar to any of you?” “They’re enslaved by the Egyptian pharaoh,” a teacher said.
  • I wasn’t sure that the biblical Exodus narrative exactly qualified as “history,” but it quickly became clear that wasn’t Decoster’s point. “Why does the pharaoh pick on the Jews?” she asked. “Because they had one God.”
  • I was stunned. Rarely in my journey through American Holocaust education did I hear anyone mention a Jewish belief.
  • “The Jews worship one God, and that’s their moral structure. Egyptian society has multiple gods whose authority goes to the pharaoh. When things go wrong, you can see how Jews as outsiders were perceived by the pharaoh as the threat.”
  • This unexpected understanding of Jewish belief revealed a profound insight about Judaism: Its rejection of idolatry is identical to its rejection of tyranny. I could see how that might make people uncomfortable.
  • Decoster moved on to a snazzy infographic of a wheel divided in thirds, each explaining a component of anti-Semitism
  • “Racial Antisemitism = False belief that Jews are a race and a threat to other races,”
  • Anti-Judaism = Hatred of Jews as a religious group,”
  • then “Anti-Jewish Conspiracy Theory = False belief that Jews want to control and overtake the world.” The third part, the conspiracy theory, was what distinguished anti-Semitism from other bigotries. It allowed closed-minded people to congratulate themselves for being open-minded—for “doing their own research,” for “punching up,” for “speaking truth to power,” while actually just spreading lies.
  • Wolfson clarified for his audience what this centuries-long demonization of Jews actually means, citing the scholar David Patterson, who has written: “In the end, the antisemite’s claim is not that all Jews are evil, but rather that all evil is Jewish.”
  • Wolfson told the teachers that it was important that “anti-Semitism should not be your students’ first introduction to Jews and Judaism.” He said this almost as an aside, just before presenting the pig-excrement image. “If you’re teaching about anti-Semitism before you teach about the content of Jewish identity, you’re doing it wrong.
  • this—introducing students to Judaism by way of anti-Semitism—was exactly what they were doing. The same could be said, I realized, for nearly all of American Holocaust education.
  • The Holocaust educators I met across America were all obsessed with building empathy, a quality that relies on finding commonalities between ourselves and others.
  • a more effective way to address anti-Semitism might lie in cultivating a completely different quality, one that happens to be the key to education itself: curiosity. Why use Jews as a means to teach people that we’re all the same, when the demand that Jews be just like their neighbors is exactly what embedded the mental virus of anti-Semitism in the Western mind in the first place? Why not instead encourage inquiry about the diversity, to borrow a de rigueur word, of the human experience?
  • I want a hologram of the late Rabbi Jonathan Sacks telling people about what he called “the dignity of difference.”
  • I want to mandate this for every student in this fractured and siloed America, even if it makes them much, much more uncomfortable than seeing piles of dead Jews doe
  • There is no empathy without curiosity, no respect without knowledge, no other way to learn what Jews first taught the world: love your neighbor
Javier E

The Aspiring Novelist Who Became Obama's Foreign-Policy Guru - The New York Times - 0 views

  • Standing in his front office before the State of the Union, Rhodes quickly does the political math on the breaking Iran story. “Now they’ll show scary pictures of people praying to the supreme leader,” he predicts, looking at the screen. Three beats more, and his brain has spun a story line to stanch the bleeding. He turns to Price. “We’re resolving this, because we have relationships,” he says.
  • Price turns to his computer and begins tapping away at the administration’s well-cultivated network of officials, talking heads, columnists and newspaper reporters, web jockeys and outside advocates who can tweet at critics and tweak their stories backed up by quotations from “senior White House officials” and “spokespeople.” I watch the message bounce from Rhodes’s brain to Price’s keyboard to the three big briefing podiums — the White House, the State Department and the Pentagon — and across the Twitterverse, where it springs to life in dozens of insta-stories, which over the next five hours don formal dress for mainstream outlets. It’s a tutorial in the making of a digital news microclimate — a storm that is easy to mistake these days for a fact of nature, but whose author is sitting next to me right now.
  • Watching Rhodes work, I remember that he is still, chiefly, a writer, who is using a new set of tools — along with the traditional arts of narrative and spin — to create stories of great consequence on the biggest page imaginable. The narratives he frames, the voices of senior officials, the columnists and reporters whose work he skillfully shapes and ventriloquizes, and even the president’s own speeches and talking points, are the only dots of color in a much larger vision about who Americans are and where we are going
  • ...56 more annotations...
  • When I asked Jon Favreau, Obama’s lead speechwriter in the 2008 campaign, and a close friend of Rhodes’s, whether he or Rhodes or the president had ever thought of their individual speeches and bits of policy making as part of some larger restructuring of the American narrative, he replied, “We saw that as our entire job.”
  • I realize during our conversations that the role Rhodes plays in the White House bears less resemblance to any specific character on Beltway-insider TV shows like “The West Wing” or “House of Cards” than it does to the people who create those shows
  • “I love Don DeLillo,” I answer. “Yeah,” Rhodes answers. “That’s the only person I can think of who has confronted these questions of, you know, the individual who finds himself negotiating both vast currents of history and a very specific kind of power dynamics. That’s his milieu. And that’s what it’s like to work in the U.S. foreign-policy apparatus in 2016.”
  • “I immediately understood that it’s a very important quality for a staffer,” Hamilton explained, “that he could come into a meeting and decide what was decided.” I suggested that the phrase “decide what was decided” is suggestive of the enormous power that might accrue to someone with Rhodes’s gifts. Hamilton nodded. “Absolutely,” he said.
  • Rhodes’s opinions were helpful in shaping the group’s conclusions — a scathing indictment of the policy makers responsible for invading Iraq. For Rhodes, who wrote much of the I.S.G. report, the Iraq war was proof, in black and white, not of the complexity of international affairs or the many perils attendant on political decision-making but of the fact that the decision-makers were morons.
  • when Rhodes joined the Obama campaign in 2007, he arguably knew more about the Iraq war than the candidate himself, or any of his advisers. He had also developed a healthy contempt for the American foreign-policy establishment, including editors and reporters at The New York Times, The Washington Post, The New Yorker and elsewhere, who at first applauded the Iraq war and then sought to pin all the blame on Bush and his merry band of neocons when it quickly turned sour
  • The job he was hired to do, namely to help the president of the United States communicate with the public, was changing in equally significant ways, thanks to the impact of digital technologies.
  • Obama relies on Rhodes for “an unvarnished take,” in part, she says, because “Ben just has no poker face,” and so it’s easy to see when he is feeling uncomfortable. “The president will be like, ‘Ben, something on your mind?’ And then Ben will have this incredibly precise lay-down of why the previous half-hour has been an utter waste of time, because there’s a structural flaw to the entire direction of the conversation.”
  • The literary character that Rhodes most closely resembles, Power volunteers, is Holden Caulfield. “He hates the idea of being phony, and he’s impetuous, and he has very strong views.”
  • He became aware of two things at once: the weight of the issues that the president was confronted with, and the intense global interest in even the most mundane presidential communications.
  • It is hard for many to absorb the true magnitude of the change in the news business — 40 percent of newspaper-industry professionals have lost their jobs over the past decade — in part because readers can absorb all the news they want from social-media platforms like Facebook, which are valued in the tens and hundreds of billions of dollars and pay nothing for the “content” they provide to their readers
  • As she explained how the process worked, I was struck by how naïve the assumption of a “state of nature” must seem in an information environment that is mediated less and less by experienced editors and reporters with any real prior knowledge of the subjects they write about. “People construct their own sense of source and credibility now,” she said. “They elect who they’re going to believe.
  • “All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.”
  • This is something different from old-fashioned spin, which tended to be an art best practiced in person. In a world where experienced reporters competed for scoops and where carrying water for the White House was a cause for shame, no matter which party was in power, it was much harder to sustain a “narrative” over any serious period of time.
  • Now the most effectively weaponized 140-character idea or quote will almost always carry the day, and it is very difficult for even good reporters to necessarily know where the spin is coming from or why
  • I brought up the soft Orwellian vibe of an information space where old media structures and hierarchies have been erased by Silicon Valley billionaires who convinced the suckers that information was “free” and everyone with access to Google was now a reporter
  • Axelrod, a former newspaperman, sighed. “It’s not as easy as standing in front of a press conference and speaking to 70 million people like past presidents have been able to do,” he said. The bully pulpit by and large doesn’t exist anymore, he explained. “So more and more, over the last couple of years, there’s been an investment in alternative means of communication: using digital more effectively, going to nontraditional sources, understanding where on each issue your constituencies are going to be found,” he said. “I think they’ve approached these major foreign-policy challenges as campaign challenges, and they’ve run campaigns, and those campaigns have been very sophisticated.
  • Rhodes’s innovative campaign to sell the Iran deal is likely to be a model for how future administrations explain foreign policy to Congress and the public.
  • The way in which most Americans have heard the story of the Iran deal presented — that the Obama administration began seriously engaging with Iranian officials in 2013 in order to take advantage of a new political reality in Iran, which came about because of elections that brought moderates to power in that country — was largely manufactured for the purpose for selling the deal. Even where the particulars of that story are true, the implications that readers and viewers are encouraged to take away from those particulars are often misleading or false
  • Obama’s closest advisers always understood him to be eager to do a deal with Iran as far back as 2012, and even since the beginning of his presidency. “It’s the center of the arc,” Rhodes explained to me two days after the deal, officially known as the Joint Comprehensive Plan of Action, was implemented. He then checked off the ways in which the administration’s foreign-policy aims and priorities converged on Iran. “We don’t have to kind of be in cycles of conflict if we can find other ways to resolve these issues,” he said. “We can do things that challenge the conventional thinking that, you know, ‘AIPAC doesn’t like this,’ or ‘the Israeli government doesn’t like this,’ or ‘the gulf countries don’t like it.’ It’s the possibility of improved relations with adversaries. It’s nonproliferation. So all these threads that the president’s been spinning — and I mean that not in the press sense — for almost a decade, they kind of all converged around Iran.”
  • The idea that there was a new reality in Iran was politically useful to the Obama administration. By obtaining broad public currency for the thought that there was a significant split in the regime, and that the administration was reaching out to moderate-minded Iranians who wanted peaceful relations with their neighbors and with America, Obama was able to evade what might have otherwise been a divisive but clarifying debate over the actual policy choices that his administration was making
  • By eliminating the fuss about Iran’s nuclear program, the administration hoped to eliminate a source of structural tension between the two countries, which would create the space for America to disentangle itself from its established system of alliances with countries like Saudi Arabia, Egypt, Israel and Turkey. With one bold move, the administration would effectively begin the process of a large-scale disengagement from the Middle East.
  • Rhodes “was kind of like the quarterback,” running the daily video conferences and coming up with lines of attack and parry. “He was extremely good about immediately getting to a phrase or a way of getting the message out that just made more sense,” Kreikemeier remembers. Framing the deal as a choice between peace and war was Rhodes’s go-to move — and proved to be a winning argument.
  • we developed a plan that was like: The Iran deal is literally going to be the tip of everything that we stand up online,” Somanader says. “And we’re going to map it onto what we know about the different audiences we’re dealing with: the public, pundits, experts, the right wing, Congress.” By applying 21st-century data and networking tools to the white-glove world of foreign affairs, the White House was able to track what United States senators and the people who worked for them, and influenced them, were seeing online — and make sure that no potential negative comment passed without a tweet.
  • If anything, that anger has grown fiercer during Rhodes’s time in the White House. He referred to the American foreign-policy establishment as the Blob. According to Rhodes, the Blob includes Hillary Clinton, Robert Gates and other Iraq-war promoters from both parties who now whine incessantly about the collapse of the American security order in Europe and the Middle East.
  • During the course of the Iran talks, Malley told me, he always kept in close contact with Rhodes. “I would often just call him and say, ‘Give me a reality check,’ ” Malley explained. “He could say, ‘Here is where I think the president is, and here is where I think he will be.’ ” He continued, “Ben would try to anticipate: Does it make sense policywise? But then he would also ask himself: How do we sell it to Congress? How do we sell it to the public? What is it going to do to our narrative?”
  • “The Iran experience was the place where I saw firsthand how policy, politics and messaging all had to be brought together, and I think that Ben is really at the intersection of all three,” Malley says. “He reflects and he shapes at the same time.
  • Rhodes’s war room did its work on Capitol Hill and with reporters. In the spring of last year, legions of arms-control experts began popping up at think tanks and on social media, and then became key sources for hundreds of often-clueless reporters. “We created an echo chamber,” he admitted, when I asked him to explain the onslaught of freshly minted experts cheerleading for the deal. “They were saying things that validated what we had given them to say.
  • When I suggested that all this dark metafictional play seemed a bit removed from rational debate over America’s future role in the world, Rhodes nodded. “In the absence of rational discourse, we are going to discourse the [expletive] out of this
  • “We had test drives to know who was going to be able to carry our message effectively, and how to use outside groups like Ploughshares, the Iran Project and whomever else. So we knew the tactics that worked.” He is proud of the way he sold the Iran deal. “We drove them crazy,” he said of the deal’s opponents.
  • Rhodes’s passion seems to derive not from any investment in the technical specifics of sanctions or centrifuge arrays, or any particular optimism about the future course of Iranian politics and society. Those are matters for the negotiators and area specialists. Rather, it derived from his own sense of the urgency of radically reorienting American policy in the Middle East in order to make the prospect of American involvement in the region’s future wars a lot less likely
  • When I asked whether the prospect of this same kind of far-reaching spin campaign being run by a different administration is something that scares him, he admitted that it does. “I mean, I’d prefer a sober, reasoned public debate, after which members of Congress reflect and take a vote,” he said, shrugging. “But that’s impossible.”
  • Obama’s particular revulsion against a certain kind of global power politics is a product, Rhodes suggests, of his having been raised in Southeast Asia. “Indonesia was a place where your interaction at that time with power was very intimate, right?” Rhodes asks. “Tens or hundreds of thousands of people had just been killed. Power was not some abstract thing,” he muses. “When we sit in Washington and debate foreign policy, it’s like a Risk game, or it’s all about us, or the human beings disappear from the decisions. But he lived in a place where he was surrounded by people who had either perpetrated those acts — and by the way, may not have felt great about that — or else knew someone who was a victim. I don’t think there’s ever been an American president who had an experience like that at a young age of what power is.
  • The parts of Obama’s foreign policy that disturb some of his friends on the left, like drone strikes, Rhodes says, are a result of Obama’s particular kind of globalism, which understands the hard and at times absolute necessity of killing. Yet, at the same time, they are also ways of avoiding more deadly uses of force — a kind of low-body-count spin move
  • He shows me the president’s copy of his Nobel Peace Prize acceptance speech, a revision of an original draft by Favreau and Rhodes whose defining tension was accepting a prize awarded before he had actually accomplished anything. In his longhand notes, Obama relocated the speech’s tension in the fact that he was accepting a peace prize a week after ordering 30,000 more troops to Afghanistan. King and Gandhi were the author’s heroes, yet he couldn’t act as they did, because he runs a state. The reason that the author had to exercise power was because not everyone in the world is rational.
  • In Panetta’s telling, his own experience at the Pentagon under Obama sometimes resembled being installed in the driver’s seat of a car and finding that the steering wheel and brakes had been disconnected from the engine. Obama and his aides used political elders like him, Robert Gates and Hillary Clinton as cover to end the Iraq war, and then decided to steer their own course, he suggests. While Panetta pointedly never mentions Rhodes’s name, it is clear whom he is talking about.
  • “Was it a point of connection between you and the president that you had each spent some substantial part of your childhoods living in another country?” I ask. Her face lights up.
  • “Absolutely,” she answers. The question is important to her. “The first conversation we had over dinner, when we first met, was about what it was like for both of us to live in countries that were predominantly Muslim countries at formative parts of our childhood and the perspective it gave us about the United States and how uniquely excellent it is,” she says. “We talked about what it was like to be children, and how we played with children who had totally different backgrounds than our own but you would find something in common.”
  • Barack Obama is not a standard-issue liberal Democrat. He openly shares Rhodes’s contempt for the groupthink of the American foreign-policy establishment and its hangers-on in the press. Yet one problem with the new script that Obama and Rhodes have written is that the Blob may have finally caught on
  • “He is a brilliant guy, but he has a real problem with what I call the assignment of bad faith,” one former senior official told me of the president. “He regards everyone on the other side at this point as being a bunch of bloodthirsty know-nothings from a different era who play by the old book
  • Another official I spoke to put the same point more succinctly: “Clearly the world has disappointed him.
  • When I asked whether he believed that the Oval Office debate over Syria policy in 2012 — resulting in a decision not to support the uprising against Assad in any meaningful way — had been an honest and open one, he said that he had believed that it was, but has since changed his mind. “Instead of adjusting his policies to the reality, and adjusting his perception of reality to the changing realities on the ground, the conclusions he draws are exactly the same, no matter what the costs have been to our strategic interests,”
  • “In an odd way, he reminds me of Bush.” The comparison is a startling one — and yet, questions of tone aside, it is uncomfortably easy to see the similarities between the two men, American presidents who projected their own ideas of the good onto an indifferent world.
  • He understands the president’s pivot toward Iran as the logical result of a deeply held premise about the negative effects of use of American military force on a scale much larger than drone strikes or Special Forces raids. “I think the whole legacy that he was working on was, ‘I’m the guy who’s going to bring these wars to an end, and the last goddamn thing I need is to start another war,’ ” he explains of Obama. “If you ratchet up sanctions, it could cause a war. If you start opposing their interest in Syria, well, that could start a war, too.”
  • I examine the president’s thoughts unfolding on the page, and the lawyerly, abstract nature of his writing process. “Moral imagination, spheres of identity, but also move beyond cheap lazy pronouncements,” one note reads. Here was the new American self — rational, moral, not self-indulgent. No longer one thing but multiple overlapping spheres or circles. Who is described here? As usual, the author is describing himself.
  • “There were staff people who put themselves in a position where they kind of assumed where the president’s head was on a particular issue, and they thought their job was not to go through this open process of having people present all these different options, but to try to force the process to where they thought the president wanted to be,” he says. “They’d say, ‘Well, this is where we want you to come out.’ And I’d say ‘[expletive], that’s not the way it works. We’ll present a plan, and then the president can make a decision
  • Perhaps the president and his aides were continually unable to predict the consequences of their actions in Syria, and made mistake after mistake, while imagining that it was going to come out right the next time
  • “Another read, which isn’t necessarily opposed to that,” I continue, “is that their actual picture is entirely coherent. But if they put it in blunt, unnuanced terms — ” Panetta completes my sentence: “ — they’d get the [expletive] kicked out of them.” He looks at me curiously. “Let me ask you something,” he says. “Did you present this theory to Ben Rhodes?”
  • “Oh, God,” Rhodes says. “The reason the president has bucked a lot of establishment thinking is because he does not agree with establishment thinking. Not because I or Denis McDonough are sitting here.” He pushes back in his chair. “The complete lack of governance in huge swaths of the Middle East, that is the project of the American establishment,” he declares. “That as much as Iraq is what angered me.
  • Ben Rhodes wanted to do right, and maybe, when the arc of history lands, it will turn out that he did. At least, he tried. Something scared him, and made him feel as if the grown-ups in Washington didn’t know what they were talking about, and it’s hard to argue that he was wrong.
  • What has interested me most about watching him and his cohort in the White House over the past seven years, I tell him, is the evolution of their ability to get comfortable with tragedy. I am thinking specifically about Syria, I add, where more than 450,000 people have been slaughtered.
  • “Yeah, I admit very much to that reality,” he says. “There’s a numbing element to Syria in particular. But I will tell you this,” he continues. “I profoundly do not believe that the United States could make things better in Syria by being there. And we have an evidentiary record of what happens when we’re there — nearly a decade in Iraq.
  • Iraq is his one-word answer to any and all criticism.
  • He mutters something about John Kerry, and then goes off the record, to suggest, in effect, that the world of the Sunni Arabs that the American establishment built has collapsed. The buck stops with the establishment, not with Obama, who was left to clean up their mess.
  • Rhodes walks me out into the sunlight of the West Wing parking lot, where we are treated to the sight of the aged Henry Kissinger, who has come to pay a visit. I ask Rhodes if he has ever met the famous diplomat before, and he tells me about the time they were seated together at a state dinner for the president of China. It was an interesting encounter to imagine, between Kissinger, who made peace with Mao’s China while bombing Laos to bits, and Rhodes, who helped effect a similar diplomatic volte-face with Iran but kept the United States out of a civil war in Syria, which has caused more than four million people to become refugees. I ask Rhodes how it felt being seated next to the embodiment of American realpolitik. “It was surreal,” he says, looking off into the middle distance. “I told him I was going to Laos,” he continues. “He got a weird look in his eye.
  • He is not Henry Kissinger, or so his logic runs, even as the underlying realist suspicion — or contempt — for the idea of America as a moral actor is eerily similar. He is torn. As the president himself once asked, how are we supposed to weigh the tens of thousands who have died in Syria against the tens of thousands who have died in Congo? What power means is that the choice is yours, no matter who is telling the story.
Javier E

How 9/11 changed us - Washington Post - 0 views

  • “The U.S. government must define what the message is, what it stands for,” the report asserts. “We should offer an example of moral leadership in the world, committed to treat people humanely, abide by the rule of law, and be generous and caring to our neighbors. . . . We need to defend our ideals abroad vigorously. America does stand up for its values.”
  • the authors pause to make a rousing case for the power of the nation’s character.
  • Rather than exemplify the nation’s highest values, the official response to 9/11 unleashed some of its worst qualities: deception, brutality, arrogance, ignorance, delusion, overreach and carelessness.
  • ...103 more annotations...
  • Reading or rereading a collection of such books today is like watching an old movie that feels more anguishing and frustrating than you remember. The anguish comes from knowing how the tale will unfold; the frustration from realizing that this was hardly the only possible outcome.
  • This conclusion is laid bare in the sprawling literature to emerge from 9/11 over the past two decades
  • Whatever individual stories the 9/11 books tell, too many describe the repudiation of U.S. values, not by extremist outsiders but by our own hand.
  • In these works, indifference to the growing terrorist threat gives way to bloodlust and vengeance after the attacks. Official dissembling justifies wars, then prolongs them. In the name of counterterrorism, security is politicized, savagery legalized and patriotism weaponized.
  • that state of exception became our new American exceptionalism.
  • The latest works on the legacy of 9/11 show how war-on-terror tactics were turned on religious groups, immigrants and protesters in the United States. The war on terror came home, and it walked in like it owned the place.
  • It happened fast. By 2004, when the 9/11 Commission urged America to “engage the struggle of ideas,” it was already too late; the Justice Department’s initial torture memos were already signed, the Abu Ghraib images had already eviscerated U.S. claims to moral authority.
  • “It is for now far easier for a researcher to explain how and why September 11 happened than it is to explain the aftermath,” Steve Coll writes in “Ghost Wars,” his 2004 account of the CIA’s pre-9/11 involvement in Afghanistan. Throughout that aftermath, Washington fantasized about remaking the world in its image, only to reveal an ugly image of itself to the world.
  • “We anticipate a black future for America,” bin Laden told ABC News more than three years before the 9/11 attacks. “Instead of remaining United States, it shall end up separated states and shall have to carry the bodies of its sons back to America.”
  • bin Laden also came to grasp, perhaps self-servingly, the benefits of luring Washington into imperial overreach, of “bleeding America to the point of bankruptcy,” as he put it in 2004, through endless military expansionism, thus beating back its global sway and undermining its internal unity.
  • To an unnerving degree, the United States moved toward the enemy’s fantasies of what it might become — a nation divided in its sense of itself, exposed in its moral and political compromises, conflicted over wars it did not want but would not end.
  • “The most frightening aspect of this new threat . . . was the fact that almost no one took it seriously. It was too bizarre, too primitive and exotic.” That is how Lawrence Wright depicts the early impressions of bin Laden and his terrorist network among U.S. officials
  • The books traveling that road to 9/11 have an inexorable, almost suffocating feel to them, as though every turn invariably leads to the first crush of steel and glass.
  • With the system “blinking red,” as CIA Director George Tenet later told the 9/11 Commission, why were all these warnings not enough? Wright lingers on bureaucratic failings
  • Clarke’s conclusion is simple, and it highlights America’s we-know-better swagger, a national trait that often masquerades as courage or wisdom. “America, alas, seems only to respond well to disasters, to be undistracted by warnings,” he writes. “Our country seems unable to do all that must be done until there has been some awful calamity.”
  • The problem with responding only to calamity is that underestimation is usually replaced by overreaction. And we tell ourselves it is the right thing, maybe the only thing, to do.
  • A last-minute flight change. A new job at the Pentagon. A retirement from the fire station. The final tilt of a plane’s wings before impact. If the books about the lead-up to 9/11 are packed with unbearable inevitability, the volumes on the day itself highlight how randomness separated survival from death.
  • Had the World Trade Center, built in the late 1960s and early 1970s, been erected according to the city building code in effect since 1938, Dwyer and Flynn explain, “it is likely that a very different world trade center would have been built.”
  • Instead, it was constructed according to a new code that the real estate industry had avidly promoted, a code that made it cheaper and more lucrative to build and own skyscrapers. “It increased the floor space available for rent . . . by cutting back on the areas that had been devoted, under the earlier law, to evacuation and exit,” the authors write. The result: Getting everybody out on 9/11 was virtually impossible.
  • The towers embodied the power of American capitalism, but their design embodied the folly of American greed. On that day, both conditions proved fatal.
  • Garrett Graff quotes Defense Department officials marveling at how American Airlines Flight 77 struck a part of the Pentagon that, because of new anti-terrorism standards, had recently been reinforced and renovated
  • “In any other wedge of the Pentagon, there would have been 5,000 people, and the plane would have flown right through the middle of the building.” Instead, fewer than 200 people were killed in the attack on the Pentagon, including the passengers on the hijacked jet. Chance and preparedness came together.
  • The bravery of police and firefighters is the subject of countless 9/11 retrospectives, but these books also emphasize the selflessness of civilians who morphed into first responders
  • The passengers had made phone calls when the hijacking began and had learned the fate of other aircraft that day. “According to one call, they voted on whether to rush the terrorists in an attempt to retake the plane,” the commission report states. “They decided, and acted.”
  • The civilians aboard United Airlines Flight 93, whose resistance forced the plane to crash into a Pennsylvania field rather than the U.S. Capitol, were later lionized as emblems of swashbuckling Americana
  • Such episodes, led by ordinary civilians, embodied values that the 9/11 Commission called on the nation to display. Except those values would soon be dismantled, in the name of security, by those entrusted to uphold them.
  • Lawyering to death. The phrase appears in multiple 9/11 volumes, usually uttered by top officials adamant that they were going to get things done, laws and rules be damned.
  • “I had to show the American people the resolve of a commander in chief that was going to do whatever it took to win,” Bush explains. “No yielding. No equivocation. No, you know, lawyering this thing to death.” In “Against All Enemies,” Clarke recalls the evening of Sept. 11, 2001, when Bush snapped at an official who suggested that international law looked askance at military force as a tool of revenge. “I don’t care what the international lawyers say, we are going to kick some ass,” the president retorted.
  • The message was unmistakable: The law is an obstacle to effective counterterrorism
  • Except, they did lawyer this thing to death. Instead of disregarding the law, the Bush administration enlisted it. “Beginning almost immediately after September 11, 2001, [Vice President Dick] Cheney saw to it that some of the sharpest and best-trained lawyers in the country, working in secret in the White House and the United States Department of Justice, came up with legal justifications for a vast expansion of the government’s power in waging war on terror.”
  • Through public declarations and secret memos, the administration sought to remove limits on the president’s conduct of warfare and to deny terrorism suspects the protections of the Geneva Conventions by redefining them as unlawful enemy combatants. Nothing, Mayer argues of the latter effort, “more directly cleared the way for torture than this.”
  • Tactics such as cramped confinement, sleep deprivation and waterboarding were rebranded as “enhanced interrogation techniques,” legally and linguistically contorted to avoid the label of torture. Though the techniques could be cruel and inhuman, the OLC acknowledged in an August 2002 memo, they would constitute torture only if they produced pain equivalent to organ failure or death, and if the individual inflicting such pain really really meant to do so: “Even if the defendant knows that severe pain will result from his actions, if causing such harm is not his objective, he lacks the requisite specific intent.” It’s quite the sleight of hand, with torture moving from the body of the interrogated to the mind of the interrogator.
  • the memo concludes that none of it actually matters. Even if a particular interrogation method would cross some legal line, the relevant statute would be considered unconstitutional because it “impermissibly encroached” on the commander in chief’s authority to conduct warfare
  • You have informed us. Experts you have consulted. Based on your research. You do not anticipate. Such hand-washing words appear throughout the memos. The Justice Department relies on information provided by the CIA to reach its conclusions; the CIA then has the cover of the Justice Department to proceed with its interrogations. It’s a perfect circle of trust.
  • In these documents, lawyers enable lawlessness. Another May 2005 memo concludes that, because the Convention Against Torture applies only to actions occurring under U.S. jurisdiction, the CIA’s creation of detention sites in other countries renders the convention “inapplicable.”
  • David Cole describes the documents as “bad-faith lawyering,” which might be generous. It is another kind of lawyering to death, one in which the rule of law that the 9/11 Commission urged us to abide by becomes the victim.
  • Similarly, because the Eighth Amendment’s prohibition on cruel and unusual punishment is meant to protect people convicted of crimes, it should not apply to terrorism detainees — because they have not been officially convicted of anything. The lack of due process conveniently eliminates constitutional protections
  • Years later, the Senate Intelligence Committee would investigate the CIA’s post-9/11 interrogation program. Its massive report — the executive summary of which appeared as a 549-page book in 2014 — found that torture did not produce useful intelligence, that the interrogations were more brutal than the CIA let on, that the Justice Department did not independently verify the CIA’s information, and that the spy agency impeded oversight by Congress and the CIA inspector general.
  • “The CIA’s effectiveness representations were almost entirely inaccurate,” the Senate report concluded. It is one of the few lies of the war on terror unmasked by an official government investigation and public report, but just one of the many documented in the 9/11 literature.
  • Officials in the war on terror didn’t deceive or dissemble just with lawmakers or the public. In the recurring tragedy of war, they lied just as often to themselves.
  • “The decision to invade Iraq was one made, finally and exclusively, by the president of the United States, George W. Bush,” he writes.
  • In Woodward’s “Bush at War,” the president admitted that before 9/11, “I didn’t feel that sense of urgency [about al-Qaeda], and my blood was not nearly as boiling.”
  • A president initially concerned about defending and preserving the nation’s moral goodness against terrorism found himself driven by darker impulses. “I’m having difficulty controlling my bloodlust,” Bush confessed to religious leaders in the Oval Office on Sept. 20, 2001,
  • Bloodlust, moral certainty and sudden vulnerability make a dangerous combination. The belief that you are defending good against evil can lead to the belief that whatever you do to that end is good, too.
  • Draper distills Bush’s worldview: “The terrorists’ primary objective was to destroy America’s freedom. Saddam hated America. Therefore, he hated freedom. Therefore, Saddam was himself a terrorist, bent on destroying America and its freedom.”
  • The president assumed the worst about what Hussein had done or might do, yet embraced best-case scenarios of how an American invasion would proceed.
  • “Iraqis would rejoice at the sight of their Western liberators,” Draper recaps. “Their newly shared sense of national purpose would overcome any sectarian allegiances. Their native cleverness would make up for their inexperience with self-government. They would welcome the stewardship of Iraqi expatriates who had not set foot in Baghdad in decades. And their oil would pay for everything.”
  • It did not seem to occur to Bush and his advisers that Iraqis could simultaneously hate Hussein and resent the Americans — feelings that could have been discovered by speaking to Iraqis and hearing their concerns.
  • few books on the war that gets deep inside Iraqis’ aversion to the Americans in their midst. “What gives them the right to change something that’s not theirs in the first place?” a woman in a middle-class Baghdad neighborhood asks him. “I don’t like your house, so I’m going to bomb it and you can rebuild it again the way I want it, with your money?”
  • The occupation did not dissuade such impressions when it turned the former dictator’s seat of government into its own luxurious Green Zone, or when it retrofitted the Abu Ghraib prison (“the worst of Saddam’s hellholes,” Shadid calls it) into its own chamber of horrors.
  • Shadid hears early talk of the Americans as “kuffar” (heathens); a 51-year-old former teacher complains that “we’ve exchanged a tyrant for an occupier.”
  • Shadid understood that governmental legitimacy — who gets to rule, and by what right — was a matter of overriding importance for Iraqis. “The Americans never understood the question,” he writes; “Iraqis never agreed on the answer.
  • When the United States so quickly shifted from liberation to occupation, it lost whatever legitimacy it enjoyed. “Bush handed that enemy precisely what it wanted and needed, proof that America was at war with Islam, that we were the new Crusaders come to occupy Muslim land,” Clarke writes. “It was as if Usama bin Laden, hidden in some high mountain redoubt, were engaging in long-range mind control of George Bush, chanting ‘invade Iraq, you must invade Iraq.’ ”
  • The foolishness and arrogance of the American occupation didn’t help. In “Imperial Life in the Emerald City: Inside Iraq’s Green Zone,” Rajiv Chandrasekaran explains how, even as daily security was Iraqis’ overwhelming concern, viceroy L. Paul Bremer, Bush’s man in Baghdad, was determined to turn the country into a model free-market economy, complete with new investment laws, bankruptcy courts and a state-of-the-art stock exchange.
  • a U.S. Army general, when asked by local journalists why American helicopters must fly so low at night, thus scaring Iraqi children, replied that the kids were simply hearing “the sound of freedom.” Message: Freedom sounds terrifying.
  • For some Americans, inflicting that terror became part of the job, one more tool in the arsenal. In “The Forever War” by Dexter Filkins, a U.S. Army lieutenant colonel in Iraq assures the author that “with a heavy dose of fear and violence, and a lot of money for projects, I think we can convince these people that we are here to help them.”
  • Chandrasekaran recalls the response of a top communications official under Bremer, when reporters asked about waves of violence hitting Baghdad in the spring of 2004. “Off the record: Paris is burning,” the official told the journalists. “On the record: Security and stability are returning to Iraq.”
  • the Iraq War, conjured in part on the false connections between Iraq and al-Qaeda, ended up helping the terrorist network: It pulled resources from the war in Afghanistan, gave space for bin Laden’s men to regroup and spurred a new generation of terrorists in the Middle East. “A bigger gift to bin Laden was hard to imagine,” Bergen writes.
  • “U.S. officials had no need to lie or spin to justify the war,” Washington Post reporter Craig Whitlock writes in “The Afghanistan Papers,” a damning contrast of the war’s reality vs. its rhetoric. “Yet leaders at the White House, the Pentagon and the State Department soon began to make false assurances and to paper over setbacks on the battlefield.” As the years passed, the deceit became entrenched, what Whitlock calls “an unspoken conspiracy” to hide the truth.
  • Afghanistan was where al-Qaeda, supported by the Taliban, had made its base — it was supposed to be the good war, the right war, the war of necessity and not choice, the war endorsed at home and abroad.
  • If Iraq was the war born of lies, Afghanistan was the one nurtured by them
  • Whitlock finds commanding generals privately admitting that they long fought the war “without a functional strategy.” That, two years into the conflict, Rumsfeld complained that he had “no visibility into who the bad guys are.”
  • That Army Lt. Gen. Douglas Lute, a former coordinator of Iraq and Afghanistan policy, acknowledged that “we didn’t have the foggiest idea of what we were undertaking.”
  • That U.S. officials long wanted to withdraw American forces but feared — correctly so, it turns out — that the Afghan government might collapse. “Bin Laden had hoped for this exact scenario,” Whitlock observes. “To lure the U.S. superpower into an unwinnable guerrilla conflict that would deplete its national treasury and diminish its global influence.”
  • All along, top officials publicly contradicted these internal views, issuing favorable accounts of steady progress
  • Bad news was twisted into good: Rising suicide attacks in Kabul meant the Taliban was too weak for direct combat, for instance, while increased U.S. casualties meant America was taking the fight to the enemy.
  • deceptions transpired across U.S. presidents, but the Obama administration, eager to show that its first-term troop surge was working, “took it to a new level, hyping figures that were misleading, spurious or downright false,” Whitlock writes. And then under President Donald Trump, he adds, the generals felt pressure to “speak more forcefully and boast that his war strategy was destined to succeed.”
  • in public, almost no senior government officials had the courage to admit that the United States was slowly losing,” Whitlock writes. “With their complicit silence, military and political leaders avoided accountability and dodged reappraisals that could have changed the outcome or shortened the conflict.”
  • Deputy Secretary of State Richard Armitage traveled to Moscow shortly after 9/11 to give officials a heads up about the coming hostilities in Afghanistan. The Russians, recent visitors to the graveyard of empires, cautioned that Afghanistan was an “ambush heaven” and that, in the words of one of them, “you’re really going to get the hell kicked out of you.”
  • a war should not be measured only by the timing and the competence of its end. We still face an equally consequential appraisal: How good was this good war if it could be sustained only by lies?
  • In the two decades since the 9/11 attacks, the United States has often attempted to reconsider its response
  • They are written as though intending to solve problems. But they can be read as proof that the problems have no realistic solution, or that the only solution is to never have created them.
  • the report sets the bar for staying so high that an exit strategy appears to be its primary purpose.
  • The counterinsurgency manual is an extraordinary document. Implicitly repudiating notions such as “shock and awe” and “overwhelming force,” it argues that the key to battling an insurgency in countries such as Iraq and Afghanistan is to provide security for the local population and to win its support through effective governance
  • It also attempts to grasp the nature of America’s foes. “Most enemies either do not try to defeat the United States with conventional operations or do not limit themselves to purely military means,” the manual states. “They know that they cannot compete with U.S. forces on those terms. Instead, they try to exhaust U.S. national will.” Exhausting America’s will is an objective that al-Qaeda understood well.
  • “Counterinsurgents should prepare for a long-term commitment,” the manual states. Yet, just a few pages later, it admits that “eventually all foreign armies are seen as interlopers or occupiers.” How to accomplish the former without descending into the latter? No wonder so many of the historical examples of counterinsurgency that the manual highlights, including accounts from the Vietnam War, are stories of failure.
  • “Soldiers and Marines are expected to be nation builders as well as warriors,” the manual proclaims, but the arduous tasks involved — reestablishing government institutions, rebuilding infrastructure, strengthening local security forces, enforcing the rule of law — reveal the tension at the heart of the new doctrine
  • In his foreword, Army Lt. Col. John Nagl writes that the document’s most lasting impact may be as a catalyst not for remaking Iraq or Afghanistan, but for transforming the Army and Marine Corps into “more effective learning organizations,” better able to adapt to changing warfare. And in her introduction, Sarah Sewall, then director of Harvard’s Carr Center for Human Rights Policy, concludes that its “ultimate value” may be in warning civilian officials to think hard before engaging in a counterinsurgency campaign.
  • “The thing that got to everyone,” Finkel explains in the latter book, “was not having a defined front line. It was a war in 360 degrees, no front to advance toward, no enemy in uniform, no predictable patterns, no relief.” It’s a powerful summation of battling an insurgency.
  • Hitting the wrong house is what counterinsurgency doctrine is supposed to avoid. Even successfully capturing or killing a high-value target can be counterproductive if in the process you terrorize a community and create more enemies. In Iraq, the whole country was the wrong house. America’s leaders knew it was the wrong house. They hit it anyway.
  • Another returning soldier, Nic DeNinno, struggles to tell his wife about the time he and his fellow soldiers burst into an Iraqi home in search of a high-value target. He threw a man down the stairs and held another by the throat. After they left, the lieutenant told him it was the wrong house. “The wrong f---ing house,” Nic says to his wife. “One of the things I want to remember is how many times we hit the wrong house.”
  • “As time passes, more documents become available, and the bare facts of what happened become still clearer,” the report states. “Yet the picture of how those things happened becomes harder to reimagine, as that past world, with its preoccupations and uncertainty, recedes.” Before making definitive judgments, then, they ask themselves “whether the insights that seem apparent now would really have been meaningful at the time.”
  • Two of the latest additions to the canon, “Reign of Terror” by Spencer Ackerman and “Subtle Tools” by Karen Greenberg, draw straight, stark lines between the earliest days of the war on terror and its mutations in our current time, between conflicts abroad and divisions at home. These works show how 9/11 remains with us, and how we are still living in the ruins.
  • When Trump declared that “we don’t have victories anymore” in his 2015 speech announcing his presidential candidacy, he was both belittling the legacy of 9/11 and harnessing it to his ends. “His great insight was that the jingoistic politics of the War on Terror did not have to be tied to the War on Terror itself,” Ackerman writes. “That enabled him to tell a tale of lost greatness.” And if greatness is lost, someone must have taken it.
  • “Trump had learned the foremost lesson of 9/11,” Ackerman writes, “that the terrorists were whomever you said they were.”
  • The backlash against Muslims, against immigrants crossing the southern border and against protesters rallying for racial justice was strengthened by the open-ended nature of the global war on terror.
  • the war is not just far away in Iraq or Afghanistan, in Yemen or Syria, but it’s happening here, with mass surveillance, militarized law enforcement and the rebranding of immigration as a threat to the nation’s security rather than a cornerstone of its identity
  • the Authorization for Use of Military Force, drafted by administration lawyers and approved by Congress just days after the attacks, as the moment when America’s response began to go awry. The brief joint resolution allowed the president to use “all necessary and appropriate force” against any nation, organization or person who committed the attacks, and to prevent any future ones.
  • It was the “Ur document in the war on terror and its legacy,” Greenberg writes. “Riddled with imprecision, its terminology was geared to codify expansive powers.” Where the battlefield, the enemy and the definition of victory all remain vague, war becomes endlessly expansive, “with neither temporal nor geographical boundaries.”
  • This was the moment the war on terror was “conceptually doomed,” Ackerman concludes. This is how you get a forever war.
  • There were moments when an off-ramp was visible. The killing of bin Laden in 2011 was one such instance, Ackerman argues, but “Obama squandered the best chance anyone could ever have to end the 9/11 era.”
  • The author assails Obama for making the war on terror more “sustainable” through a veneer of legality — banning torture yet failing to close the detention camp at Guantánamo Bay and relying on drone strikes that “perversely incentivized the military and the CIA to kill instead of capture.”
  • There would always be more targets, more battlefields, regardless of president or party. Failures became the reason to double down, never wind down.
  • The longer the war went on, the more that what Ackerman calls its “grotesque subtext” of nativism and racism would move to the foreground of American politics
  • Absent the war on terror, it is harder to imagine a presidential candidate decrying a sitting commander in chief as foreign, Muslim, illegitimate — and using that lie as a successful political platform.
  • Absent the war on terror, it is harder to imagine a travel ban against people from Muslim-majority countries. Absent the war on terror, it is harder to imagine American protesters labeled terrorists, or a secretary of defense describing the nation’s urban streets as a “battle space” to be dominated
  • In his latest book on bin Laden, Bergen argues that 9/11 was a major tactical success but a long-term strategic failure for the terrorist leader. Yes, he struck a vicious blow against “the head of the snake,” as he called the United States, but “rather than ending American influence in the Muslim world, the 9/11 attacks greatly amplified it,” with two lengthy, large-scale invasions and new bases established throughout the region.
  • “A vastly different America has taken root” in the two decades since 9/11, Greenberg writes. “In the name of retaliation, ‘justice,’ and prevention, fundamental values have been cast aside.”
  • the legacy of the 9/11 era is found not just in Afghanistan or Iraq, but also in an America that drew out and heightened some of its ugliest impulses — a nation that is deeply divided (like those “separated states” bin Laden imagined); that bypasses inconvenient facts and embraces conspiracy theories; that demonizes outsiders; and that, after failing to spread freedom and democracy around the world, seems less inclined to uphold them here
  • Seventeen years after the 9/11 Commission called on the United States to offer moral leadership to the world and to be generous and caring to our neighbors, our moral leadership is in question, and we can barely be generous and caring to ourselves.
  • Still reeling from an attack that dropped out of a blue sky, America is suffering from a sort of post-traumatic stress democracy. It remains in recovery, still a good country, even if a broken good country.
  • 9/11 was a test. The books of the last two decades show how America failed.
  • Deep within the catalogue of regrets that is the 9/11 Commission report
Javier E

An Ancient Guide to the Good Life | The New Yorker - 0 views

  • What’s striking about AITA is the language in which it states its central question: you’re asked not whether I did the right thing but, rather, what sort of person I’m being.
  • We would have a different morality, and an impoverished one, if we judged actions only with those terms of pure evaluation, “right” or “wrong,” and judged people only “good” or “bad.”
  • If Aristotle’s ethics is to be sold as a work of what we call self-help, we have to ask: How helpful is it?
  • ...40 more annotations...
  • Our vocabulary of commendation and condemnation is perpetually changing, but it has always relied on “thick” ethical terms, which combine description and evaluation.
  • “How to flourish” was one such topic, “flourishing” being a workable rendering of Aristotle’s term eudaimonia. We might also translate the term in the usual way, as “happiness,” as long as we suspend some of that word’s modern associations; eudaimonia wasn’t something that waxed and waned with our moods
  • For Aristotle, ethics was centrally concerned with how to live a good life: a flourishing existence was also a virtuous one.
  • “famously terse, often crabbed in their style.” Crabbed, fragmented, gappy: it can be a headache trying to match his pronouns to the nouns they refer to. Some of his arguments are missing crucial premises; others fail to spell out their conclusions.
  • Aristotle is obscure in other ways, too. His highbrow potshots at unnamed contemporaries, his pop-cultural references, must have tickled his aristocratic Athenian audience. But the people and the plays he referred to are now lost or forgotten. Some readers have found his writings “affectless,” stripped of any trace of a human voice, or of a beating human heart.
  • Flourishing is the ultimate goal of human life; a flourishing life is one that is lived in accord with the various “virtues” of the character and intellect (courage, moderation, wisdom, and so forth); a flourishing life also calls for friendships with good people and a certain measure of good fortune in the way of a decent income, health, and looks.
  • much of what it says can sound rather obvious
  • Virtue is not just about acting rightly but about feeling rightly. What’s best, Aristotle says, is “to have such feelings at the right time, at the right objects and people, with the right goal, and in the right manner.” Good luck figuring out what the “right time” or object or manner is.
  • Virtue is a state “consisting in a mean,” Aristotle maintains, and this mean “is defined by reference to reason, that is to say, to the reason by reference to which the prudent person would define it.”
  • The phrase “prudent person” here renders the Greek phronimos, a person possessed of that special quality of mind which Aristotle called “phronesis.” But is Aristotle then saying that virtue consists in being disposed to act as the virtuous person does? That sounds true, but trivially so.
  • it helps to reckon with the role that habits of mind play in Aristotle’s account. Meyer’s translation of “phronesis” is “good judgment,” and the phrase nicely captures the combination of intelligence and experience which goes into acquiring it, along with the difficulty of reducing it to a set of explicit principles that anyone could apply mechanically, like an algorithm.
  • “good judgment” is an improvement on the old-fashioned and now misleading “prudence”; it’s also less clunky than another standby, “practical wisdom.”
  • The enormous role of judgment in Aristotle’s picture of how to live can sound, to modern readers thirsty for ethical guidance, like a cop-out. Especially when they might instead pick up a treatise by John Stuart Mill and find an elegantly simple principle for distinguishing right from wrong, or one by Kant, in which they will find at least three. They might, for that matter, look to Jordan Peterson, who conjures up as many as twelve.
  • the question of how to flourish could receive a gloomy answer from Aristotle: it may be too late to start trying. Why is that? Flourishing involves, among other things, performing actions that manifest virtues, which are qualities of character that enable us to perform what Aristotle calls our “characteristic activity.”
  • But how do we come to acquire these qualities of character, or what Meyer translates as “dispositions”? Aristotle answers, “From our regular practice.”
  • In a passage missing from Meyer’s ruthless abridgment, Aristotle warns, “We need to have been brought up in noble habits if we are to be adequate students of noble and just things. . . . For we begin from the that; if this is apparent enough to us, we can begin without also knowing why. Someone who is well brought up has the beginnings, or can easily acquire them.”
  • Aristotle suggests, more generally, that you should identify the vices you’re susceptible to and then “pull yourself away in the opposite direction, since by pulling hard against one fault, you get to the mean (as when straightening out warped planks).”
  • Sold as a self-help manual in a culture accustomed to gurus promulgating “rules for living,” Aristotle’s ethics may come as a disappointment. But our disappointment may tell us more about ourselves than it does about Aristotle.
  • Michael Oakeshott wrote that “nobody supposes that the knowledge that belongs to the good cook is confined to what is or may be written down in the cookery book.” Proficiency in cooking is, of course, a matter of technique
  • My tutor’s fundamental pedagogical principle was that to teach a text meant being, at least for the duration of the tutorial, its most passionate champion. Every smug undergraduate exposé of a fallacy would be immediately countered with a robust defense of Aristotle’s reasoning.
  • “How to read Aristotle? Slowly.”
  • I was never slow enough. There was always another nuance, another textual knot to unravel
  • Sometimes we acquire our skills by repeatedly applying a rule—following a recipe—but when we succeed what we become are not good followers of recipes but good cooks. Through practice, as Aristotle would have said, we acquire judgment.
  • What we were doing with this historical text wasn’t history but philosophy. We were reading it not for what it might reveal about an exotic culture but for the timelessly important truths it might contain—an attitude at odds with the relativism endemic in the rest of the humanities.
  • There is no shortcut to understanding Aristotle, no recipe. You get good at reading him by reading him, with others, slowly and often. Regular practice: for Aristotle, it’s how you get good generally.
  • “My parents taught me the difference between right and wrong,” he said, “and I can’t think what more there is to say about it.” The appropriate response, and the Aristotelian one, would be to agree with the spirit of the remark. There is such a thing as the difference between right and wrong. But reliably telling them apart takes experience, the company of wise friends, and the good luck of having been well brought up
  • we are all Aristotelians, most of the time, even when forces in our culture briefly persuade us that we are something else. Ethics remains what it was to the Greeks: a matter of being a person of a certain sort of sensibility, not of acting on “principles,” which one reserves for unusual situations of the kind that life sporadically throws up
  • That remains a truth about ethics even when we’ve adopted different terms for describing what type of person not to be: we don’t speak much these days of being “small-souled” or “intemperate,” but we do say a great deal about “douchebags,” “creeps,” and, yes, “assholes.
  • In one sense, it tells us nothing that the right thing to do is to act and feel as the person of good judgment does. In another sense, it tells us virtually everything that can be said at this level of generality.
  • If self-help means denying the role that the perceptions of others play in making us who we are, if it means a set of rules for living that remove the need for judgment, then we are better off without it.
  • Aristotle had little hope that a philosopher’s treatise could teach someone without much experience of life how to make the crucial ethical distinctions. We learn to spot an “asshole” from living; how else?
  • when our own perceptions falter, we continue to do today exactly what Aristotle thought we should do. He asserts, in another significant remark that doesn’t make Meyer’s cut, that we should attend to the words of the old and experienced at least as much as we do to philosophical proofs: “these people see correctly because experience has given them their eye.”
  • Is it any surprise that the Internet is full of those who need help seeing rightly? Finding no friendly neighborhood phronimos to provide authoritative advice, you defer instead to the wisdom of an online community.
  • “The self-made man,” Oakeshott wrote, “is never literally self-made, but depends upon a certain kind of society and upon a large unrecognized inheritance.”
  • It points us in the right direction: toward the picture of a person with a certain character, certain habits of thinking and feeling, a certain level of self-knowledge and knowledge of other people.
  • We have long lived in a world desperate for formulas, simple answers to the simple question “What should I do?”
  • the algorithms, the tenets, the certificates are all attempts to solve the problem—which is everybody’s problem—of how not to be an asshole. Life would be a lot easier if there were rules, algorithms, and life hacks solving that problem once and for all. There aren’t.
  • At the heart of the Nicomachean Ethics is a claim that remains both edifying and chastening: phronesis doesn’t come that easy. Aristotle devised a theory that was vague in just the right places, one that left, intentionally, space to be filled in by life. 
  • Twenty-four centuries later, we’re still guided by the approach toward ethical life that Aristotle exemplified, one in which the basic question is not what we do but who we are
  • The Internet has no shortage of moralists and moralizers, but one ethical epicenter is surely the extraordinary, addictive subreddit called “Am I the Asshole?,” popularly abbreviated AITA
Javier E

Ozempic or Bust - The Atlantic - 0 views

  • June 2024 Issue
  • it is impossible to know, in the first few years of any novel intervention, whether its success will last.
  • ...77 more annotations...
  • The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started
  • Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.
  • The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic.
  • Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”
  • Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)
  • In the United States, an estimated 189 million adults are classified as having obesity or being overweight
  • The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight
  • For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.
  • The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or cholesterol. By those criteria, more than 140 million American adults already qualify
  • if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”
  • How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk
  • The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years
  • the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not borne out.
  • adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time?
  • “All of us, in the back of our minds, always wonder, Will something show up?  ” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”
  • in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.
  • The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories.
  • But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.
  • Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses.
  • Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might produce in the very best scenario an average lasting weight loss of just 5 to 10 percent
  • National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.
  • Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm.
  • In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,”
  • In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million.
  • It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life.
  • the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”
  • News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque.
  • Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.”
  • “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”
  • She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”
  • For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.
  • In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.)
  • In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.
  • The message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water—plain old water—as something “cool” to drink.
  • Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could.
  • Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country.
  • An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.
  • when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.
  • The social engineering worked. Slowly but surely, Americans’ lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out.
  • although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.
  • sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item,
  • that tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.
  • Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.
  • From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.
  • obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”
  • she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions
  • Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.
  • if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it
  • She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit.
  • Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.
  • “Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.
  • Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.
  • As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid
  • By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners
  • But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.
  • Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon who had struggled to overcome an eating disorder had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look
  • The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. In 2015, that rate had risen to three-quarters.
  • In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”
  • that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity.
  • That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism
  • But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”
  • Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events.
  • As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell.
  • Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed.
  • In the meantime, thinness was coming back into fashion
  • In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.
  • Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight.
  • If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat.
  • Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.
  • Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”
  • When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.
  • the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now
  • Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies.
  • “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”
  • The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways
  • When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.”
  • why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.
Javier E

Japanese Mind: Understanding Contemporary Japanese Culture (Roger J. Davies and Osamu I... - 0 views

  • In Japan, the need for strong emotional unity has also resulted in an inability to criticize others openly. As a consequence, the development of ambiguity can be viewed as a defining characteristic of the Japanese style of communication: Japanese conversation does not take the form of dialectic development. The style of conversation is almost always fixed from beginning to end depending on the human relationship. It is one-way, like a lecture, or an inconclusive argument going along parallel lines or making a circle round and round, and in the end still ending up mostly at the beginning.
  • To express oneself distinctly carries the assumption that one’s partner knows nothing, so clear expression can be considered impolite.
  • Japanese people, too, have their own opinions, but they tend to wait their turn to speak out. If they completely disagree with a speaker, they will usually listen with an air of acceptance at first, then disagree in a rather vague and roundabout way. This is considered the polite way to do things in Japan.
  • ...52 more annotations...
  • In Japan, however, if you go against someone and create a bad atmosphere, your relations may break off completely. People tend to react emotionally, and most are afraid of being excluded from the group.
  • For the Japanese, silence indicates deep thinking or consideration, but too much silence often makes non-Japanese uncomfortable. Whereas the Japanese consider silence as rather good and people generally feel sympathetic toward it, non-Japanese sometimes feel that it is an indication of indifference or apathy.
  • The concept of amae greatly affects all aspects of Japanese life because it is related to other characteristics of the Japanese way of thinking, such as enryo (restraint), giri (social obligation), tsumi (sin), haji (shame) (Doi, 1973, pp. 33–48).
  • In other words, in the inner circle, amae is at work and there is no enryo, in the middle zone enryo is present, and in the outer circle, which is the world of strangers, there is neither amae nor enryo.
  • they feel giri (obligation) when others, toward whom they have enryo (restraint), show kindness to them. However, they do not express their appreciation as much to people they are close to and with whom they can amaeru
  • Japanese have difficulty saying no, in contrast to Westerners, who are able to do so more easily. The reason for this is that Japanese relationships, which are based on amae, are unstable (Doi; cited in Sahashi, 1980, p. 79); that is, people hesitate to refuse others for fear of breaking this bond. Doi insists that Westerners can refuse easily because amae is not at work in their relationships
  • Hirayama and Takashina (1994, pp. 22–23) state, for example, that the Japanese sense of beauty is based on a concept known as mono no aware, a kind of aesthetic value that comes from feelings, while in Western art, people try to construct something of beauty with a logic of what is beautiful. In contrast, Japanese art focuses not on what is logically considered beautiful, but on what people feel is beautiful. The Japanese aesthetic is very subjective, and there are no absolute criteria as to what this should be.
  • Aware is thus connected to feelings of regret for things losing their beauty, and paradoxically finding beauty in their opposite. Moreover, anything can ultimately be appreciated as beautiful in Japan, and what is beautiful depends upon people’s subjective point of view.
  • ma is an empty space full of meaning, which is fundamental to the Japanese arts and is present in many fields, including painting, architecture, music, and literature.
  • The Japanese have long treated silence as a kind of virtue similar to “truthfulness.” The words haragei and ishin denshin symbolize Japanese attitudes toward human interactions in this regard. The former means implicit mutual understanding; the latter suggests that people can communicate with each other through telepathy. In short, what is important and what is true in Japan will often exist in silence, not in verbal expression.
  • uchi-soto, or inner and outer duality. Lebra (1987, p. 345) provides an explanation: [The Japanese] believe that the truth lies only in the inner realm as symbolically located in the heart or belly. Components of the outer self, such as face, mouth, spoken words, are in contrast, associated with cognitive and moral falsity. Truthfulness, sincerity, straightforwardness, or reliability are allied to reticence. Thus a man of few words is trusted more than a man of many words.
  • Zen training is designed to teach that truth cannot be described verbally, but can exist only in silence. Traditional Japanese arts and the spirit of dō (the “way” or “path”) reflect this characteristic silence.
  • Otoko-masari means a woman who is superior to men physically, spiritually, and intellectually. However, despite this literal meaning of “a woman who exceeds men,” it often sounds negative in Japanese because it carries a connotation of lacking femininity, and such women are usually disliked.
  • Zen emphasizes that all human beings originally possess the Buddha-nature within themselves and need only the actual experience of it to achieve enlightenment (satori). This is a state that is seen as a liberation from man’s intellectual nature, from the burden of fixed ideas and feelings about reality: “Zen always aims at grasping the central fact of life, which can never be brought to the dissecting table of the intellect” (Suzuki, 1964,
  • For the Zen master, the best way to express one’s deepest experiences is by the use of paradoxes that transcend opposites (e.g., “Where there is nothing, there is all” or “To die the great death is to gain the great life”). These sayings illustrate two irreducible Zen dilemmas—the inexpressibility of truth in words, and that “opposites are relational and so fundamentally harmonious” (Watts, 1957, p. 175).
  • In all forms of activity, Zen emphasizes the importance of acting naturally, gracefully, and spontaneously in whatever task one is performing, an attitude that has greatly influenced all forms of cultural expression in Japan.
  • moves must be repeated thousands of times and perfected before new techniques may be learned. The purpose of such discipline is “not only to learn new skills but also to build good character and a sense of harmony in the disciple” (Niki et al., 1993, p.
  • Common expressions in Japanese reflect these steps: kata ni hairu (follow the form), kata ni jukutatsu suru (perfect the form), and kata kara nukeru (go beyond the form).
  • All practice takes place in an atmosphere of quietude, obedience, and respect, mirroring the absolute obedience and respect of the master-student relationship.
  • Japanese mothers, who “apparently do not make explicit demands on their children and do not enforce rules when children resist. Yet, diverse accounts suggest that Japanese children strongly internalize parental, group, and institutional values”
  • Sen no Rikyu transformed the tea ceremony in the sixteenth century with an aesthetic principle known as wabi, or the contrast of refinement, simplicity, and rusticity. He advocated the use of plain, everyday Japanese utensils rather than those imported from China in the tea ceremony. Proportions and sizes were carefully chosen to harmonize perfectly with the small tearooms. Not only the utensils but the styles of the buildings and tea gardens, the order and etiquette of the ceremony were designed to be in accord with an atmosphere in which the goal was to perfect one’s existence without self-indulgence. Thus, the ideas of simplicity, perfection, discipline, and harmony with nature, which are central to the Zen way of life, are also reflected in sadō.
  • Reischauer (1988, p. 200) concurs: The Japanese have always seemed to lean more toward intuition than reason, to subtlety and sensitivity in expression rather than to clarity of analysis, to pragmatism rather than to theory, and to organizational skills rather than to great intellectual concepts. They have never set much store by clarity of verbal analysis and originality of thought. They put great trust in nonverbal understanding and look on oral or written skills and on sharp and clever reasoning as essentially shallow and possibly misleading. They value in their literature not clear analysis, but artistic suggestiveness and emotional feeling. The French ideal of simplicity and absolute clarity in writing leaves them unsatisfied. They prefer complexity and indirection as coming closer to the truth.
  • Gambaru is a frequently used word in Japan, with the meaning of doing one’s best and hanging on.
  • a discussion on the subject, scholars, journalists, and graduate students from other countries who know the Japanese and Japanese culture well provided expressions that are close to gambaru in their mother tongues, such as aushalten, beharren, and Beharrung in German; tiens bon in French; aguante in Spanish; and chā yo in Chinese.
  • The characteristics most often associated with the traditional Japanese arts are keishikika (formalization), kanzen shugi (the beauty of complete perfection), seishin shūyō (mental discipline), and tōitsu (integration and rapport with the skill). The steps that are followed are as follows: (1) the establishment and formalization of the pattern or form (kata): every action becomes rule-bound (keishikika); (2) the constant repetition of the pattern or form (hampuku); (3) mastering the pattern or form, as well as the classification of ability en route to mastery, resulting in licensing and grades (kyū and dan); (4) perfecting the pattern or form (kanzen shugi): the beauty of perfection; (5) going beyond the pattern or form, becoming one with it (tōitsu)
  • There are some expressions that are often used in America but seldom in Japan, such as “take it easy.” Americans say to a person who is busy working, “take it easy” or “don’t work too hard”; in contrast, the Japanese say “gambatte” (or work hard) as a sign of encouragement. Americans, of course, also think that it is necessary to be diligent, but as the proverb says, “all work and no play makes Jack a dull boy,” suggesting that working too hard is not good for you.
  • giri involves caring for others from whom one has received a debt of gratitude and a determination to realize their happiness, sometimes even by self-sacrificing. (Gillespie & Sugiura, 1996, p. 150)
  • Giri can perhaps best be understood as a constellation of related meanings, the most important of which are as follows: (1) moral principles or duty, (2) rules one has to obey in social relationships, and (3) behavior one is obliged to follow or that must be done against one’s will (Matsumura, 1988, p.
  • the cost of ochūgen and oseibo gifts is almost equivalent to the cost of justice in the USA, meaning that the cost of keeping harmony in human relations and that of mediating legal disputes is almost the same.
  • A Japanese dictionary (cited in Matsumoto, 1988, p. 20) describes haragei as follows: (1) the verbal or physical action one employs to influence others by the potency of rich experience and boldness, and (2) the act of dealing with people or situations through ritual formalities and accumulated experience. In other words, haragei is a way of exchanging feelings and thoughts in an implicit way among the Japanese.
  • Honne and tatemae are another related set of concepts that are linked to haragei. “These terms are often used as contrasting yet complementary parts of a whole, honne being related to the private, true self, and tatemae typifying the public persona and behavior. Honne then has to do with real intentions and sincere feelings, while tatemae conveys the face the world sees” (Matsumoto, 1988, p. 18). People in Japan are implicitly taught from a young age how to use honne and tatemae properly, and these concepts are important in maintaining face and not hurting the feelings of others; therefore, what a speaker says is not always what he or she really means.
  • those who cannot use these concepts effectively are not considered to be good communicators, because they may hurt others or make a conversation unpleasant by revealing honne at the wrong moment.
  • In high-context cultures most of the information lies either in the setting or people who are part of the interaction. Very little information is actually contained in a verbal message. In low-context cultures, however, the verbal message contains most of the information and very little is embedded in the context or the participants. (Samovar & Porter, 1995,
  • Personal space in Japanese human relationships can be symbolized by two words that describe both physical and psychological distance between individuals: hedataru and najimu. Hedataru means “to separate one thing from another, to set them apart,” and it is also used in human relationships with such nuances as “to estrange, alienate, come between, or cause a rupture between friends.” A relationship between two persons without hedatari means they are close. On the other hand, najimu means “to become attached to, become familiar with, or used to.” For instance, if one says that students “najimu ” their teacher, it means that they become attached to and have close feelings for the teacher.
  • Relationships are established through hedataru and then deepened by najimu, and in this process, three stages are considered important: maintaining hedatari (the noun form of the verb hedataru), moving through hedatari, and deepening friendship by najimu.
  • Underlying these movements are the Japanese values of restraint and self-control. In Japan, relationships are not built by insisting strongly on one’s own point of view but require time, a reserved attitude, and patience.
  • In the seventh century, Prince Shotoku, who was a nephew of Emperor Suiko, occupied the regency and discovered a way of permitting Buddhism and the emperor system to coexist, along with another belief system adopted from China, Confucianism. He stated that “Shinto is the trunk, Buddhism is the branches, and Confucianism is the leaves” (Sakaiya, 1991, p. 140). By following this approach, the Japanese were able to accept these new religions and philosophies, and the cultural values and advanced techniques that came with them, in such a way that they were able to reconcile their theoretical contradictions.
  • Iitoko-dori, then, refers specifically to this process of accepting convenient parts of different, and sometimes contradictory, religious value systems, and this practice has long been widespread in Japan. In modern times, Sakaiya (ibid., p. 144) notes that the number of Japanese people who do not admit to following some form of iitoko-dori is only about 0.5 percent of the population.
  • However, the process of iitoko-dori, which has given rise to relative rather than absolute ethical value systems, has also resulted in serious negative consequences. For example, many Japanese students will not oppose bullies and stop them from hurting weaker students.
  • In other words, in Japan, even if people know that something is wrong, it is sometimes difficult for them to defend their principles, because rather than being absolute, these principles are relative and are easily modified, depending on the situation and the demands of the larger group to which people belong.
  • Both Chinese and Korean have the characters that make up gambaru, but they do not have expressions that possess the same nuances. This suggests that gambaru is an expression that is unique to Japan and expresses certain qualities of the Japanese character.
  • It is also interesting to note the differences in this concept of “good-child identity” between Japan and America. As far as expectations for children’s mental development are concerned, Japanese mothers tend to place emphasis on manners, while with American mothers the stress is on linguistic self-expression.
  • In other words, the ideal of the “good child” in Japan is that he or she should not be self-assertive in terms of rules for living together in society, while American “good children” should have their own opinions and be able to stand by themselves.
  • In other words, Japanese mothers tend to refer to people’s feelings, or even to those of inanimate objects, to modify their child’s behavior, and this establishes the basis for making judgments for the child: Children who are taught that the reason for poor behavior has something to do with other people’s feelings tend to place their basis for judgments, or for their behavior, on the possibility of hurting others.
  • As a result, there is a constant emphasis on other people’s feelings in Japan, and parents try to teach their children from a very early age to be sensitive to this information. In Japan, people are expected to consider others first and foremost, and this is a prerequisite for proper behavior in society. It
  • A senior or an elder is called a sempai; one who is younger or subordinate is a kōhai. This sempai-kōhai dichotomy exists in virtually all Japanese corporate, educational, and governmental organizations.
  • The Japanese language has one of the most complicated honorific (keigo) systems in the world. There are basically three types of keigo: teineigo (polite speech), sonkeigo (honorific speech), and kenjōgo (humble speech). Teineigo is used in both
  • Although keigo is used to address superiors or those whom one deeply respects, it is also widely employed in talking to people one does not know well, or who are simply older than oneself. Moreover, it is common for company employees to use keigo in addressing their bosses, whether or not they feel any respect for the other on a personal level. As
  • Recently, it has been said that the younger generation cannot use keigo properly. In fact, children do not use it in addressing their parents at home, nor do students in addressing their teachers in modern Japan. Furthermore, humble forms seem to be disappearing in colloquial language and can be found today only in formal speeches, greetings, and letters.
  • Dictionaries usually suggest kenkyo as the equivalent of modesty. One Japanese dictionary states that kenkyo means sunao to hikaeme. Hikaeme gives the impression of being reserved, and sunao has a variety of meanings, including “gentle, mild, meek, obedient, submissive, docile, compliant, yielding,” and so on. Many of these adjectives in English connote a weak character, but in Japanese sunao is always seen as a compliment. Teachers often describe good students as sunaona iiko. This means that they are quiet, listen to what the teacher says, and ask no questions in class.
  • The Japanese ideal of the perfect human being is illustrated in these folktales, and this is generally a person who has a very strong will.
  • Dentsu Institute reported that only 8 percent of Japanese people surveyed said that they would maintain their own opinion even if it meant falling out with others, which was the lowest percentage in all Asian countries (“Dour and dark outlook,” 2001, p. 19).
Javier E

Where Environmentalists Went Wrong - Yascha Mounk - 0 views

  • what is wrong with a particular kind of increasingly common environmental regulation: one that is short on impact but big on virtue signaling.
  • Some American states have banned cafés and restaurants from offering their customers single-use plastic straws. Many jurisdictions around the world now require grocery stores to charge their customers for plastic bags. The EU has phased out incandescent light bulbs. The EU has also banned plastic bottles with removable caps, leading to the introduction of bottles that don’t always properly close once they have been opened. Though not yet implemented, some prominent organizations and activists have called for gas stoves to be banned.
  • These seemingly disparate examples share an important commonality: They are a form of policy intervention that achieves small improvements for the environment at the cost of a salient deterioration in quality of life or a large loss of political goodwill. For that reason, each of these interventions is likely to backfire.
  • ...48 more annotations...
  • policy makers and environmentalists need to get smart about political capital: how to build it and, most importantly, how to avoid wasting it.
  • Environmentalist policies don’t just need to be well-intentioned or feel virtuous; they need to be effective in accomplishing their stated goals.
  • Cumulatively, they risk giving citizens the impression that those in charge care more about forcing them to change their lifestyle than about solving real problems
  • If we want to win the fight against climate change, we need to get serious about achieving the biggest possible environmental impact for the smallest possible price in quality of life and political goodwill
  • Low-impact policies that demand small, if frequent and highly salient, sacrifices feel virtuous. But they deplete a disproportionate amount of political capital.
  • It’s time for a new paradigm. Call it “effective environmentalism.”
  • This is driven by a deeper sense, widespread in the environmental movement, that the fight against climate change is coterminous with the fight to remake the world from scratch. To many, social ills like racism, sexism and even capitalism itself are facets of one interrelated system of oppression. A victory against any one facet requires a victory against all.
  • Naomi Klein’s bestselling This Changes Everything is a classic of the genre. Tellingly, the first change she admonishes her readers to make concerns their lifestyle: “For us high consumers,” she writes, preventing the dire future that awaits humanity requires “changing how we live.”
  • Even more tellingly, Klein maintains that making these changes will require nothing short of the abolition of capitalism. To her, the right way to understand this historical moment is as “a battle between capitalism and the planet.”
  • it turns out that you can’t scare and shame people into taking action on climate change. If anything, this political moment seems to be characterized by a mix of apathy and backlash. In the United States, a recent poll of young voters reveals that only 6 percent of them consider “environmental issues” their top priority, the same number who say their top priority is immigration (economic issues easily eclipse both).
  • As recently as four years ago, Germany’s Green Party was polling around 25 percent of the vote, and looked likely to lead a federal government for the first time in the country’s history. Now, its support is down to about ten percent, with the decline among young voters especially dramatic. Opinions about the party in the electorate give a clue about the source of its troubles:
  • In a recent exit poll conducted during the state election in Brandenburg, 71 percent of voters complained that the party “has insufficient concern with the economy and creating jobs.” 66 percent complained that the party “wants to tell us how to live.”
  • The environment, like most areas of public policy, is the realm of painful trade-offs. Efforts to fix the climate crisis will involve a significant degree of expense and inconvenience. For both moral and strategic reasons, the goal of environmental regulation should therefore be to accomplish important goals while minimizing these costs insofar as possible
  • effective environmentalism consists in actions or policies which maximize positive impact on the environment while minimizing both the price for humans’ quality of life and the depletion of a collective willingness to adopt other impactful measures.
  • most of the time, such a definition is less helpful than the spirit which animates it. And that spirit is best captured in a more informal register. So rather than focusing on the definition, effective environmentalists should evaluate any proposed action, policy or regulation by asking themselves three questions:
  • 1. How big a positive impact (if any) will the proposed action have?
  • In politics, it’s easy to obsess over whatever happens to be salient. If some question touches a cultural nerve, or has given rise to major political battles in the past, its stakes can come to seem existential—even if not much hinges on it in the real world. This is part of what makes it so tempting to obsess about such things as banning plastic straws or detachable bottle caps (which have little impact) rather than tax incentives or cap-and-trade schemes (which would have a vastly larger impact)
  • 2. To what extent will the proposed action lead to a deterioration in quality of life?
  • this also gives them reason to care about the negative consequences that environmental policies may have for human welfare. So the extent of the trade-off needs to be a key consideration. The bigger an adverse impact a particular policy has on people’s quality of life, the more skeptical we should be about implementing it.
  • For the most part, people who worry about climate change and other forms of environmental degradation are motivated by a concern about human welfare. They worry about the negative consequences that runaway climate change would have for humankind
  • 3. To what extent will the proposed action lead to backlash?
  • Political capital is limited. In most democracies, a clear majority of the population now cares about climate change to some extent. But this genuine concern competes with, and tends to be eclipsed by, voters’ concern about economic priorities like the availability of good jobs
  • This context makes it all the more important for voters to feel that governments and environmental groups are focusing on impactful steps that leave them in charge of decisions about their own lives; otherwise, support for any environmental policy is likely to polarize along partisan lines, or even to crater across the board. 
  • When I coined the term “effective environmentalism,” I was of course inspired by an earlier movement: “effective altruism.”
  • for all of the problems with effective altruism, the original insight on which it is built is hard to contest. People spend billions of dollars on charitable contributions every year. Much of that money goes to building new gyms at fancy universities or upgrading the local cat shelter. Wouldn’t it be better to direct donors’ altruistic instincts to more impactful endeavors, potentially saving the lives of thousands of people?
  • Something similar holds true for the environmental movement. Many activists are more focused on interventions that feel virtuous than on ones that will make a real difference. As a result, much of the movement has proven ineffective
  • Effective altruists pride themselves in adopting principles and mental heuristics that are supposed to help them assess what to do in a more rational way. These include not judging an idea based on who says it; reserving judgment about an idea until you’ve analyzed both its benefits and its costs; paying attention to the relative weight of different priorities; and being skeptical about forms of symbolic politics that don’t lead to real change
  • these norms make a lot of sense, and have relevance for environmentalists focused on having real impact.
  • So, to figure out what policies can make the biggest difference in the fight against climate change, and actually win the political capital to put these into practice, effective environmentalists should:
  • Assess Policies on the Basis of their Impact, not Their Perceived Purity:
  • Prioritize Actions that Solve the Biggest Problems:
  • It would be a mistake to subsume all environmental concerns to the fight against climate change. People have reasons to care about living in a clean environment or alleviating animal suffering even if it does not help to protect us from the threat posed by climate change
  • There are a variety of environmental goals, and it makes sense to recognize this plurality of goods. And yet, those who care about environmental goals need to have a clear sense of relative priorities. Some goals are more important than others
  • effective environmentalists should unflinchingly give precedence to the most important goals.
  • Be Willing to Build Cross-Ideological Coalitions:
  • Activists increasingly pride themselves in being “intersectional.” Since they believe that various forms of oppression intersect, people who want to participate in the fight against one form of injustice must also get on board with a set of progressive assumptions about how to combat other forms of perceived injustice
  • This can raise the entrance ticket for anyone who wants to get involved in fighting for an environmental cause; distract major environmental organizations from fighting for their stated goals; and make powerful players unwilling to forge tactical coalitions with partners whose broader worldview they disdain
  • Effective environmentalists should reject this purist instinct, making common cause with anyone who favors impactful action irrespective of the views they may hold about unrelated issues.
  • Put People in Charge of Their Own Lives:
  • Effective environmentalists should fight to transition as much of the economy as possible to forms of energy that do not emit carbon. This will require broad political support and, yes, real financial trade-offs
  • effective environmentalists should avoid overly intrusive regulations about how people then go about using that energy. If consumers are willing to pay an elevated price for the pleasure of sitting on an outside terrace in the late fall, it shouldn’t be for the government or for environmental activists to decide that a different use of energy is more morally righteous.
  • No-Bullshit Environmentalism
  • For the last decades, the environmentalist movement has tried its hand at fear-mongering.
  • this kind of rhetoric is factually misleading and politically disastrous.
  • This is why I favor a different approach. This approach centers the serious risks posed by climate change. But it also insists that humans are capable of meeting this moment with a mix of collective action and ingenuity.
  • With the right investments and regulations, we can reduce carbon emissions and mitigate the impact of a warming planet. And while this transition will exact considerable costs, it need not make us poor or require us to abstain from putting plentiful energy to its many miraculous uses.
  • as a start, the mix of policies advocated by effective environmentalists is likely to include: a commitment to creating energy abundance while transitioning towards a low-carbon economy; significant investment in both renewable and nuclear energy; regulatory action to raise the price of fossil-fuels; the adoption of genetically-modified crops that can withstand a changing climate; public and private investment to mitigate the effects of the warming that is already underway; the development and adoption of new technologies that can capture carbon; and a willingness to do serious research on speculative ideas, such as marine cloud brightening, that have the potential to avert worst-case outcomes in the case of a climate emergency. 
  • In life as in economics, trade-offs are real. But in the context of a growing economy, we should be able to bear those costs without suffering any overall reduction in human affluence or well-being. If we adopt the principles of effective environmentalism and take energetic action, our future shines bright.
Javier E

Why Is Every Young Person in America Watching 'The Sopranos'? - The New York Times - 0 views

  • Biederman argued that the show is, at its heart, about the bathetic nature of decline. “Decline not as a romantic, singular, aesthetically breathtaking act of destruction,” he said, but as a humiliating, slow-motion slide down a hill into a puddle of filth. “You don’t flee a burning Rome with your beautiful beloved in your arms, barely escaping a murderous horde of barbarians; you sit down for 18 hours a day, enjoy fewer things than you used to, and take on the worst qualities of your parents while you watch your kids take on the worst qualities of you.”
  • The show’s depiction of contemporary America as relentlessly banal and hollow is plainly at the core of the current interest in the show, which coincides with an era of crisis across just about every major institution in American life.
  • “The Sopranos” has a persistent focus on the spiritual and moral vacuum at the center of this country, and is oddly prescient about its coming troubles: the opioid epidemic, the crisis of meritocracy, teenage depression and suicide, fights over the meaning of American history.
  • ...31 more annotations...
  • that’s what I felt back in those days,” he said, “that everything was for sale — it was all about distraction, it didn’t seem serious. It all felt foolish and headed for a crash.”
  • Younger viewers do not have to fear Chase’s wrath, because they are not so obviously its object. They are also able to watch the show for hours on end, which makes the subtext and themes more apparent. Perhaps all of this has offered clarity that was not possible when the show aired. Perhaps it is easier now to see exactly who — or what — Chase was angry at.
  • it is easily one of the most written-about TV shows in the medium’s short history. But more than the shows that have emerged in its wake, which are subjected to close readings and recaps in nearly every major publication, “The Sopranos” has a novelistic quality that actually withstands this level of scrutiny. It’s not uncommon to hear from people who have watched the series several times, or who do so on a routine basis — people who say it reveals new charms at different points in life
  • Perhaps the greatest mystery of all, looking back on “The Sopranos” all these years later, is this: What was Chase seeing in the mid-’90s — a period when the United States’ chief geopolitical foe was Serbia, when the line-item veto and school uniforms were front-page news, when “Macarena” topped the charts — that compelled him to make a show that was so thoroughly pessimistic about this country?
  • “I don’t think I felt like it was a good time,” he told me. He is 76 now, and speaks deliberately and thoughtfully. “I felt that things were going downhill.” He’d become convinced America was, as Neil Postman’s 1985 polemic put it, “Amusing Ourselves to Death,” not an easy thing for a journeyman TV writer to accept.
  • “There was nothing but crap out there. Crap in every sense. I was beginning to feel that people’s predictions about the dumbing-down of society had happened and were happening, and I started to see everything getting tawdry and cheap.”
  • Expanded access to credit had cut into what mobsters call the shylock business; there’s no need to go to a loan shark when the payday lender will offer you similarly competitive rates. Gambling was legalized in many states and flourishes on many reservations; nearly every state in the Union has a lottery, which decimated the numbers racket. Italian American neighborhoods have emptied out — as Jacobs writes, “radically diminishing the pool of tough teenagers with Cosa Nostra potential”; this is dramatized brilliantly in the final episode of the series, when a mobster from a New York family hurries through Little Italy on an important phone call and, when the call ends, looks around to see he’s wandered into an encroaching and vibrant Chinatown. And, Jacobs notes, union membership has been decimated. “In the mid-1950s, about 35 percent of U.S. workers belonged to a union,” he writes. “In recent years, only 6.5 percent of private-sector workers have been union members.”
  • I was about to change the subject when he hit on something. “Have you noticed — or maybe you haven’t noticed — how nobody does what they say they’re going to do?” he said, suddenly animated. “If your sink gets jammed up, and a guy says he’s going to be out there at 5:30 — no. Very few people do what they say they’re going to do. There is a decline in goods and services that is enormous.”
  • Chase told me the real joke of the show was not “What if a mobster went to therapy?” The comedic engine, for him, was this: What if things had become so selfish and narcissistic in America that even the mob couldn’t take it? “That was the whole thing,” he said. “America was so off the rails that everything that the Mafia had done was nothing compared to what was going on around them.”
  • In “The Mafia: A Cultural History,” Roberto M. Dainotto, a professor of literature at Duke, writes that one thing our cinematic Mafiosi have that we admire, against our better judgment, is access to structures of meaning outside of market forces: the church, family, honor. The Mafia movie often pits these traditional values against the corrosive and homogenizing effects of American life.
  • What “The Sopranos” shows us, Dainotto argues, is what happens when all that ballast is gone, and the Mafia is revealed to be as ignoble as anything else. “Life is what it is,” he writes, “and repeats as such.”
  • The show puts all this American social and cultural rot in front of characters wholly incapable of articulating it, if they even notice it.
  • What is, for me, one of the show’s most memorable scenes has no dialogue at all. Tony and his crew have just returned from a business trip to Italy, during which they were delighted with the Old Country but also confronted with the degree of their alienation from their own heritage. They’re off the plane, and in a car traveling through Essex County. As the camera pans by the detritus of their disenchanted world — overpasses, warehouses — Tony, Paulie and Christopher are seeing their home with fresh eyes, and maybe wondering if their ancestors made a bad trade or if, somewhere along the line, something has gone horribly wrong. But we don’t know: For once, these arrogant, stupid and loquacious men are completely silent.
  • Around the time “The Sopranos” premiered, the N.Y.U. Law professor James B. Jacobs wrote a paper, along with a student, arguing that the Mafia, though weakened by decades of prosecutions, could come roaring back. By 2019, though, he had published a new paper called “The Rise and Fall of Organized Crime in the United States,” declaring the Mafia all but finished. “The world in which the Cosa Nostra became powerful is largely gone,” he wrote. And he cites a litany of factors that aided its collapse, a mix of technological advances, deregulation and financialization — many of the same forces that have created the stratified economy of today.
  • In his first therapy session with Dr. Melfi, Tony tries to explain why he thinks he has panic attacks, why he suffers from stress. “The morning of the day I got sick, I’d been thinking: It’s good to be in something from the ground floor,” he says. “I came too late for that, I know. But lately, I’m getting the feeling that I came in at the end. The best is over.” Melfi tells him that many Americans feel that way. Tony presses on: “I think about my father: He never reached the heights like me, but in a lot of ways he had it better. He had his people, they had their standards, they had their pride. Today, what do we got?”
  • You can see this world — one in which no one can be squeezed because everyone is being squeezed — starting to take shape from the very beginning of the show. In the pilot, Tony is fending off competition from a new waste-hauling business undercutting his company’s extortionate fees, and trying to figure out how he can get a piece of the similarly extortionate costs his health insurer paid for his M.R.I. — a procedure he had because the stress in his life had given him a panic attack.
  • The bien-pensant line on Tony remains that he’s a sociopath, and only used therapy to become a better criminal. This is an idea spoon-fed to the viewer in the final episodes by a contrite Dr. Melfi, in a show that spoon-feeds almost nothing to the viewer. Melfi herself might call this a coping mechanism to avoid the messier reality, which is that Tony lives in an immoral world nestled within another immoral world, both of which have only grown more chaotic because of forces outside his control.
  • Because of this, you can see how he reasons himself into more and more heinous crimes, justifying each and every one of them to himself. Perhaps to you too — at least, up to a point. That sympathy for Tony led contemporaneous critics to ask if people were watching the show in the wrong way, or if our enjoyment pointed to a deficiency of the heart.
  • It is this quality of Tony’s — this combination of privilege and self-loathing — that I suspect resonates with a younger generation, whether we want to admit it or not. He’s not so different from us, after all. He has an anxiety disorder. He goes to therapy and takes S.S.R.I.s, but never really improves — not for long, anyway. He has a mild case of impostor syndrome, having skipped some key steps to becoming boss, and he knows that people who hold it against him are sort of right. He’s still proud of his accomplishments in high school. He does psychedelics in the desert, and they change his perspective on things. He often repeats stuff he half-remembers someone smarter than him saying. He’s arguably in an open marriage with Carmela, if a rather lopsided one. He liked listening to “Don’t Stop Believin’” in 2007. He’s impulsive and selfish and does not go to church, though he does seem open to vaguer notions of spirituality. He wishes his career provided him with meaning, but once he had the career, he discovered that someone had pulled the rug out at some point, and an institution that had been a lodestar to him for his whole life was revealed to be a means of making money and nothing more. Does this sound at all familiar to you?
  • Like many young people, Tony is a world-historically spoiled man who is nevertheless cursed, thanks to timing, to live out the end of an enterprise he knows on some level to be immoral.
  • It gives him panic attacks, but he’s powerless to find a way out. Thus trapped — and depressed — it’s not so hard for him to allow himself a few passes, to refuse to become better because the world is so rotten anyway.
  • Tony’s predicament was once his to suffer alone, but history has unfolded in such a way as to render his condition nearly universal.
  • That the people in power truly had insulated themselves in a fantasy environment — not just in the realm of foreign policy, but also, more concretely, in the endless faux-bucolic subdivisions that would crater the economy. We were living in a sort of irreality, one whose totality would humiliate and delegitimize nearly every important institution in American life when it ended, leaving — of all people — the Meadows and A.J.s of the world to make sense of things.
  • if people still see a monster in Tony, then the monster is themselves: a twisted reflection of a generation whose awakening to the structures that control them came in tandem with a growing aversion to personal accountability in the face of these systems.
  • Whether that’s true or not, it offers us all permission to become little Tonys, lamenting the sad state of affairs while doing almost exactly nothing to improve ourselves, or anything at all.
  • This tendency is perhaps most pronounced online, where we are all in therapy all day, and where you can find median generational opinions perfectly priced by the marketplace of ideas — where we bemoan the wrongs of the world and tell ourselves that we can continue being who we are, and enjoy the comforts we’ve grown accustomed to.
  • In the show’s finale, as the extended Soprano family gathers to mourn the death of Bobby Baccalieri, we find Paulie Walnuts stuck at the kids’ table, where A.J., newly politically awakened, charges into a rant. You people are screwed, he says. “You’re living in a dream.” Bush let Al Qaeda escape, he tells them, and then made us invade some other country? Someone at the table tells him that if he really cares, he should join up. A.J. responds: “It’s more noble than watching these jackoff fantasies on TV of how we’re kicking their ass. It’s like: America.” Again, he’s interrupted: What in the world does he mean? He explains: “This is still where people come to make it. It’s a beautiful idea. And then what do they get? Bling? And come-ons for [expletive] they don’t need and can’t afford?”
  • However inartfully, A.J. was gesturing at something that would have been hard for someone his age to see at the time, which is that the ’00s were a sort of fever dream, a tragic farce built on cheap money and propaganda.
  • The notion that individual action might help us avoid any coming or ongoing crises is now seen as hopelessly naïve, the stuff of Obama-era liberalism.
  • The “leftist ‘Sopranos’ fan” is now such a well-known type that it is rounding the corner to being an object of scorn and mockery online.
  • One oddity that can’t be ignored in this “Sopranos” resurgence is that, somewhat atypically for a TV fandom, there is an openly left-wing subcurrent within it.
Javier E

On Grand Strategy (John Lewis Gaddis) - 0 views

  • Ordinary experience, he pointed out, is filled with “ends equally ultimate . . . , the realization of some of which must inevitably involve the sacrifice of others.” The choices facing us are less often between stark alternatives—good versus evil, for instance—than between good things we can’t have simultaneously. “One can save one’s soul, or one can found or maintain or serve a great and glorious State,” Berlin wrote, “but not always both at once.”
  • We resolve these dilemmas by stretching them over time. We seek certain things now, put off others until later, and regard still others as unattainable. We select what fits where, and then decide which we can achieve when. The process can be difficult: Berlin emphasized the “necessity and agony of choice.” But if such choices were to disappear, he added, so too would “the freedom to choose,” and hence liberty itself.24
  • only narratives can show dilemmas across time. It’s not enough to display choices like slivers on a microscope slide. We need to see change happen, and we can do that only by reconstituting the past as histories, biographies, poems, plays, novels, or films. The best of these sharpen and shade simultaneously: they compress what’s happening in order to clarify, even as they blur, the line between instruction and entertainment. They are, in short, dramatizations. And a fundamental requirement of these is never to bore.
  • ...74 more annotations...
  • When Thaddeus Stevens (Tommy Lee Jones) asks the president how he can reconcile so noble an aim with such malodorous methods, Lincoln recalls what his youthful years as a surveyor taught him: [A] compass . . . [will] point you true north from where you’re standing, but it’s got no advice about the swamps and deserts and chasms that you’ll encounter along the way. If in pursuit of your destination, you plunge ahead, heedless of obstacles, and achieve nothing more than to sink in a swamp . . . , [then] what’s the use of knowing true north?
  • The real Lincoln, as far as I know, never said any of this, and the real Berlin, sadly, never got to see Spielberg’s film. But Tony Kushner’s screenplay shows Fitzgerald’s linkage of intelligence, opposing ideas, and the ability to function: Lincoln keeps long-term aspirations and immediate necessities in mind at the same time. It reconciles Berlin’s foxes and hedgehogs with his insistence on the inevitability—and the unpredictability—of choice:
  • Whether we approach reality from the top down or the bottom up, Tolstoy seems to be saying, an infinite number of possibilities exist at an indeterminate number of levels, all simultaneously. Some are predictable, most aren’t, and only dramatization—free from the scholar’s enslavement to theory and archives—can begin to represent them.
  • what is “training,” as Clausewitz understands it? It’s being able to draw upon principles extending across time and space, so that you’ll have a sense of what’s worked before and what hasn’t. You then apply these to the situation at hand: that’s the role of scale. The result is a plan, informed by the past, linked to the present, for achieving some future goal.
  • I think he’s describing here an ecological sensitivity that equally respects time, space, and scale. Xerxes never had it, despite Artabanus’ efforts. Tolstoy approximated it, if only in a novel. But Lincoln—who lacked an Artabanus and who didn’t live to read War and Peace—seems somehow to have achieved it, by way of a common sense that’s uncommon among great leaders.
  • It’s worth remembering also that Lincoln—and Shakespeare—had a lifetime to become who they were. Young people today don’t, because society so sharply segregates general education, professional training, ascent within an organization, responsibility for it, and then retirement.
  • This worsens a problem Henry Kissinger identified long ago: that the “intellectual capital” leaders accumulate prior to reaching the top is all they’ll be able to draw on while at the top.37 There’s less time now than Lincoln had to learn anything new.
  • A gap has opened between the study of history and the construction of theory, both of which are needed if ends are to be aligned with means. Historians, knowing that their field rewards specialized research, tend to avoid the generalizations
  • Theorists, keen to be seen as social “scientists,” seek “reproducibility” in results: that replaces complexity with simplicity in the pursuit of predictability. Both communities neglect relationships between the general and the particular—between universal and local knowledge—that nurture strategic thinking.
  • concrete events in time and space—the sum of the actual experience of actual men and women in their relation to one another and to an actual three-dimensional, empirically experienced, physical environment—this alone contained the truth,
  • Collaboration, in theory, could have secured the sea and the land from all future dangers. That would have required, though, the extension of trust, a quality with strikingly shallow roots in the character of all Greeks.
  • The only solution then is to improvise, but this is not just making it up as you go along. Maybe you’ll stick to the plan, maybe you’ll modify it, maybe you’ll scrap it altogether. Like Lincoln, though, you’ll know your compass heading, whatever the unknowns that lie between you and your destination. You’ll have in your mind a range of options for dealing with these, based—as if from Machiavelli—upon hard-won lessons from those who’ve gone before.
  • The past and future are no more equivalent, in Thucydides, than are capabilities and aspirations in strategy—they are, however, connected.
  • The past we can know only from imperfect sources, including our own memories. The future we can’t know, other than that it will originate in the past but then depart from it. Thucydides’ distinction between resemblance and reflection—between patterns surviving across time and repetitions degraded by time—aligns the asymmetry, for it suggests that the past prepares us for the future only when, however imperfectly, it transfers. Just as capabilities restrict aspirations to what circumstances will allow.
  • Insufficiency demands indirection, and that, Sun Tzu insists, requires maneuver: [W]hen capable, feign incapacity; when active, inactivity. When near, make it appear that you are far; when far away, that you are near. Offer an enemy a bait to lure him; feign disorder and strike him. . . . When he concentrates, prepare against him; where he is strong, avoid him. . . . Pretend inferiority and encourage his arrogance. . . . Keep him under a strain and wear him down. Opposites held in mind simultaneously, thus, are “the strategist’s keys to victory.”
  • it was Pericles who, more than anyone else, unleashed the Peloponnesian War—the unintended result of constructing a culture to support a strategy.
  • By the mid-450s Pericles, who agreed, had finished the walls around Athens and Piraeus, allowing total reliance on the sea in any future war. The new strategy made sense, but it made the Athenians, as Thucydides saw, a different people. Farmers, traditionally, had sustained Athens: their fields and vineyards supplied the city in peacetime, and their bodies filled the ranks of its infantry and cavalry when wars came. Now, though, their properties were expendable and their influence diminished.
  • If Athens were to rely upon the ardor of individuals, then it would have to inspire classes within the city and peoples throughout the empire—even as it retained the cohesiveness of its rival Sparta, still in many ways a small town.
  • Pericles used his “funeral oration,” delivered in Athens at the end of the Peloponnesian War’s first year, to explain what he hoped for. The dead had given their lives, he told the mourners, for the universality of Athenian distinctiveness: Athens imitated no one, but was a pattern for everyone. How, though, to reconcile these apparent opposites? Pericles’ solution was to connect scale, space, and time: Athenian culture would appeal to the city, the empire, and the ages.
  • The city had acquired its “friends,” Pericles acknowledged, by granting favors, “in order by continued kindness to keep the recipient in [its] debt; while the debtor [knows] that the return he makes will be a payment, not a free gift.” Nevertheless, the Athenians had provided these benefits “not from calculations of expediency, but in the confidence of liberality.” What he meant was that Athens would make its empire at once more powerful and more reassuring than that of any rival.
  • It could in this way project democracy across cultures because insecure states, fearing worse, would freely align with Athens.22 Self-interest would become comfort and then affinity.
  • The Athenians’ strategy of walling their cities, however, had reshaped their character, obliging them restlessly to roam the world. Because they had changed, they would have to change others—that’s what having an empire means—but how many, to what extent, and by what means? No one, not even Pericles, could easily say.
  • Equality, then, was the loop in Pericles’ logic. He saw both it and empire as admirable, but was slow to sense that encouraging one would diminish the other.
  • Like Lincoln, Pericles looked ahead to the ages. He even left them monuments and sent them messages. But he didn’t leave behind a functional state: it would take well over two millennia for democracy again to become a model with mass appeal.
  • as Thucydides grimly observes, war “brings most men’s character to a level with their fortunes.”
  • “Island” strategies require steady nerves. You have to be able to watch smoke rise on horizons you once controlled without losing your own self-confidence, or shaking that of allies, or strengthening that of adversaries.
  • For the abstractions of strategy and the emotions of strategists can never be separated: they can only be balanced. The weight attached to each, however, will vary with circumstances. And the heat of emotions requires only an instant to melt abstractions drawn from years of cool reflection.
  • if credibility is always in doubt, then capabilities must become infinite or bluffs must become routine. Neither approach is sustainable: that’s why walls exist in the first place.
  • he encouraged his readers to seek “knowledge of the past as an aid to the understanding of the future, which in the course of human things must resemble if it does not reflect it.” For without some sense of the past the future can be only loneliness: amnesia is a solitary affliction.
  • But to know the past only in static terms—as moments frozen in time and space—would be almost as disabling, because we’re the progeny of progressions across time and space that shift from small scales to big ones and back again. We know these through narratives, whether historical or fictional or a combination of both.
  • No one can anticipate everything that might happen. Sensing possibilities, though, is better than having no sense at all of what to expect. Sun Tzu seeks sense—even common sense—by tethering principles, which are few, to practices, which are many.
  • Clausewitz’s concept of training, however, retains its relevance. It’s the best protection we have against strategies getting stupider as they become grander, a recurring problem in peace as well as war. It’s the only way to combine the apparent opposites of planning and improvisation: to teach the common sense that comes from knowing when to be a hedgehog and when a fox.
  • Victories must connect: otherwise they won’t lead anywhere. They can’t be foreseen, though, because they arise from unforeseen opportunities. Maneuvering, thus, requires planning, but also improvisation. Small triumphs in a single arena set up larger ones elsewhere, allowing weaker contenders to become stronger.
  • The actions of man, Kennan concluded, “are governed not so much by what he intellectually believes as by what he vividly realizes.”
  • Nor is it clear, even now, whether Christianity caused Rome’s “fall”—as Gibbon believed—or—as the legacies of Augustus suggest—secured Rome’s institutional immortalities. These opposites have shaped “western” civilization ever since. Not least by giving rise to two truly grand strategies, parallel in their purposes but devised a thousand years apart
  • Augustine shows that reality always falls short of the ideal: one can strive toward it, but never expect to achieve it. Seeking, therefore, is the best man can manage in a fallen world, and what he seeks is his choice. Nevertheless, not all ends are legitimate; not all means are appropriate. Augustine seeks, therefore, to guide choice by respecting choice. He does this through an appeal to reason: one might even say to common sense.
  • A peaceful faith—the only source of justice for Christians—can’t flourish without protection, whether through toleration, as in pre-Constantine Rome, or by formal edict, as afterward.20 The City of God is a fragile structure within the sinful City of Man. It’s this that leads Christians to entrust authority to selected sinners—we call it “politics”—and Augustine, for all his piety, is a political philosopher.
  • Augustine concluded that war, if necessary to save the state, could be a lesser evil than peace—and that the procedural prerequisites for necessity could be stated. Had provocation occurred? Had competent authority exhausted peaceful alternatives? Would the resort to violence be a means chosen, not an end in itself? Was the expenditure of force proportionate to its purposes, so that it wouldn’t destroy what it was meant to defend?
  • No one before Augustine, however, had set standards to be met by states in choosing war. This could be done only within an inclusionary monotheism, for only a God claiming universal authority could judge the souls of earthly rulers. And only Augustine, in his era, spoke so self-confidently for Him.
  • Augustine’s great uncertainty was the status of souls in the City of Man, for only the fittest could hope to enter the City of God. Pre-Christian deities had rarely made such distinctions: the pagan afterlife was equally grim for heroes, scoundrels, and all in between.25 Not so, though, with the Christian God: behavior in life would make a huge difference in death. It was vital, then, to fight wars within rules. The stakes could hardly be higher.
  • Alignment, in turn, implies interdependence. Justice is unattainable in the absence of order, peace may require the fighting of wars, Caesar must be propitiated—perhaps even, like Constantine, converted—if man is to reach God. Each capability brings an aspiration within reach, much as Sun Tzu’s practices tether his principles, but what’s the nature of the tether? I think it’s proportionality: the means employed must be appropriate to—or at least not corrupt—the end envisaged. This, then, is Augustine’s tilt: toward a logic of strategy transcending time, place, culture, circumstance, and the differences between saints and sinners.
  • a more revealing distinction may lie in temperament: to borrow from Milan Kundera,37 Machiavelli found “lightness of being” bearable. For Augustine—perhaps because traumatized as a youth by a pear tree—it was unendurable.
  • “I judge that it might be true that fortune is arbiter of half our actions, but also that she leaves the other half, or close to it, for us to govern.” Fifty percent fortune, fifty percent man—but zero percent God. Man is, however precariously, on his own.
  • States, Machiavelli suggests, operate similarly. If governed badly, men’s rapacity will soon overwhelm them, whether through internal rebellion or external war. But if run with virtù—his untranslatable term for planning without praying40—states can constrain, if not in all ways control, the workings of fortune, or chance. The skills needed are those of imitation, adaptation, and approximation.
  • Machiavelli commends the study of history, “for since men almost always walk on paths beaten by others and proceed in their actions by imitation . . . , a prudent man should always enter upon the paths beaten by great men, and imitate those who have been most excellent, so that if his own virtue does not reach that far, it is at least in the odor of it.”
  • What, then, to do? It helped that Machiavelli and Berlin had lightness of being, for their answer is the same: don’t sweat it. Learn to live with the contradictions. Machiavelli shows “no trace of agony,” Berlin points out, and he doesn’t either:
  • Eternal truths have little to do with any of this, beyond the assurance that circumstances will change. Machiavelli knows, as did Augustine, that what makes sense in one situation may not in the next. They differ, though, in that Machiavelli, expecting to go to Hell, doesn’t attempt to resolve such disparities. Augustine, hoping for Heaven, feels personally responsible for them. Despite his afflictions, Machiavelli often sees comedy.42 Despite his privileges, Augustine carries a tragic burden of guilt. Machiavelli sweats, but not all the time. Augustine never stops.
  • “Lightness of being,” then, is the ability, if not to find the good in bad things, then at least to remain afloat among them, perhaps to swim or to sail through them, possibly even to take precautions that can keep you dry. It’s not to locate logic in misfortunes, or to show that they’re for the best because they reflect God’s will.
  • Augustine and Machiavelli agree that wars should be fought—indeed that states should be run—by pre-specifiable procedures. Both know that aspirations aren’t capabilities. Both prefer to connect them through checklists, not commandments.43
  • Augustine admits, which is why good men may have to seek peace by shedding blood. The greater privilege, however, is to avert “that calamity which others are under the necessity of producing.” Machiavelli agrees, but notes that a prince so infrequently has this privilege that if he wishes to remain in power he must “learn to be able not to be good,” and to use this proficiency or not use it “according to necessity.”51 As fits man’s fallen state, Augustine sighs. As befits man, Machiavelli simplifies.
  • As Machiavelli’s finest translator has put it: “[J]ustice is no more reasonable than what a person’s prudence tells him he must acquire for himself, or must submit to, because men cannot afford justice in any sense that transcends their own preservation.”53
  • princes need advisers. The adviser can’t tell the prince what to do, but he can suggest what the prince should know. For Machiavelli this means seeking patterns—across time, space, and status—by shifting perspectives. “[J]ust as those who sketch landscapes place themselves down in the plain to consider the nature of mountains . . . and to consider the nature of low places place themselves high atop mountains,
  • Machiavelli embraces, then, a utilitarian morality: you proportion your actions to your objective, not to progress from one nebulous city to another, but because some things have been shown to work and others haven’t.60
  • Who, then, will oversee them? They’ll do it themselves, Machiavelli replies, by balancing power. First, there’ll be a balance among states, unlike older Roman and Catholic traditions of universality. Machiavelli anticipates the statecraft of Richelieu, Metternich, Bismarck,
  • But Machiavelli understands balancing in a second and subtler sense, conveyed more explicitly in The Discourses than in The Prince: [I]t is only in republics that the common good is looked to properly in that all that promotes it is carried out; and, however much this or that private person may be the loser on this account, there are so many who benefit thereby that the common good can be realized in spite of those few who suffer in consequence.64 This idea of an internal equilibrium within which competition strengthens community wouldn’t appear again until Adam Smith unveiled an “invisible hand” in The Wealth of Nations (1776), until the American Founding Fathers drafted and in The Federalist justified constitutional checks and balances (1787–88), and until Immanuel Kant linked republics, however distantly, with Perpetual Peace (1795).
  • Machiavelli’s great transgression, Berlin concluded, was to confirm what everyone knows but no one will admit: that ideals “cannot be attained.” Statecraft, therefore, can never balance realism against idealism: there are only competing realisms. There is no contest, in governing, between politics and morality: there is only politics. And no state respects Christian teaching on saving souls. The incompatibilities are irreconcilable. To deny this is, in Berlin’s words but in Machiavelli’s mind, to “vacillate, fall between two stools, and end in weakness and failure.”
  • And approximation? “[P]rudent archers,” Machiavelli points out, knowing the strength of their bow, “set their aim much higher than the place intended, not to reach such height with their arrow, but to be able with the aid of so high an aim to achieve their plan.”41 For there will be deflection—certainly from gravity, perhaps from wind, who knows from what else? And the target itself will probably be moving.
  • Augustine’s City of God no longer exists on earth. The City of Man, which survives, has no single path to salvation. “[T]he belief that the correct, objectively valid solution to the question of how men should live can in principle be discovered,” Berlin finds, “is itself in principle not true.” Machiavelli thus split open the rock “upon which Western beliefs and lives had been founded.” It was he “who lit the fatal fuse.”
  • Machiavelli’s blood ran colder than was ordinary: he praised Cesare Borgia, for example, and he refused to condemn torture despite having suffered it (Augustine, never tortured, took a similar position).75 Machiavelli was careful, however, to apportion enormities: they should only forestall greater horrors—violent revolution, defeat in war, descent into anarchy, mass killing, or what we would today call “genocide.”
  • Berlin sees in this an “economy of violence,” by which he means holding a “reserve of force always in the background to keep things going in such a way that the virtues admired by [Machiavelli] and by the classical thinkers to whom he appeals can be protected and allowed to flower.”76 It’s no accident that Berlin uses the plural. For it comes closer than the singular, in English, to Machiavelli’s virtù, implying no single standard by which men must live.
  • “[T]here are many different ends that men may seek and still be fully rational,” Berlin insists, “capable of understanding . . . and deriving light from each other.” Otherwise, civilizations would exist in “impenetrable bubble[s],” incomprehensible to anyone on the outside. “Intercommunication between cultures in time and space is possible only because what makes men human is common to them, and acts as a bridge between them. But our values are ours, and theirs are theirs.”
  • Perhaps there are other worlds in which all principles are harmonized, but “it is on earth that we live, and it is here that we must believe and act.”77 By shattering certainty, Machiavelli showed how. “[T]he dilemma has never given men peace since it came to light,” Berlin lightly concludes, “but we have learnt to live with it.”
  • Posterity has long regarded Augustine and Machiavelli as pivots in the history of “western” thought because each, with enduring effects, shifted long-standing relationships between souls and states.
  • Philip promises obedience to God, not his subjects. Elizabeth serves her subjects, fitting God to their interests. The king, looking to Heaven, venerates. The queen, feet on earth, calculates. The differences test the ideas of Augustine and Machiavelli against the demands of statecraft at the dawn of the modern age.
  • Relishing opposites, the queen was constant only in her patriotism, her insistence on keeping ends within means, and her determination—a requirement for pivoting—never to be pinned down.
  • Pivoting requires gyroscopes, and Elizabeth’s were the best of her era. She balanced purposefulness with imagination, guile, humor, timing, and an economy in movement that, however extravagant her display, kept her steady on the tightrope she walked.
  • Machiavelli, thinking gyroscopically, advised his prince to be a lion and a fox, the former to frighten wolves, the latter to detect snares. Elizabeth went him one better by being lion, fox, and female, a combination the crafty Italian might have learned to appreciate. Philip was a grand lion, but he was only a lion.
  • princes can through conscientiousness, Machiavelli warned, become trapped. For a wise ruler “cannot observe faith, nor should he, when such observance turns against him, and the causes that made him promise have been eliminated. . . . Nor does a prince ever lack legitimate causes to color his failure to observe faith.”46
  • What we like to recall as the Elizabethan “golden age” survived only through surveillance and terror: that was another of its contradictions, maintained regretfully with resignation.
  • The queen’s instincts were more humane than those of her predecessors, but too many contemporaries were trying to kill her. “Unlike her sister, Elizabeth never burned men for their faith,” her recent biographer Lisa Hilton has written. “She tortured and hanged them for treason.”60 Toleration, Machiavelli might have said, had turned against Elizabeth. She wanted to be loved—who wouldn’t? It was definitely safer for princes, though, to be feared.
  • “The failure of the Spanish Armada,” Geoffrey Parker has argued, “laid the American continent open to invasion and colonization by northern Europeans, and thus made possible the creation of the United States.” If that’s right, then the future pivoted on a single evening—August 7, 1588—owing to a favorable wind, a clever lord admiral, and a few fiery ships. Had he succeeded, Philip would have required Elizabeth to end all English voyages to America.4
  • In contrast to Spain’s “new world” colonies—and to the territories that France, more recently, had claimed (but barely settled) along the banks of the St. Lawrence, the Great Lakes, and the Ohio and Mississippi rivers—British America “was a society whose political and administrative institutions were more likely to evolve from below than to be imposed from above.”10 That made it a hodgepodge, but also a complex adaptive system.
  • The principles seem at odds—how can supremacies share?—but within that puzzle, the modern historian Robert Tombs has suggested, lay the foundations of England’s post-Stuart political culture: [S]uspicion of Utopias and zealots; trust in common sense and experience; respect for tradition; preference for gradual change; and the view that “compromise” is victory, not betrayal. These things stem from the failure of both royal absolutism and of godly republicanism: costly failures, and fruitful ones.
Adam Serwer: White Nationalism's Deep American Roots - The Atlantic - 0 views

  • The concept of “white genocide”—extinction under an onslaught of genetically or culturally inferior nonwhite interlopers—may indeed seem like a fringe conspiracy theory with an alien lineage, the province of neo-Nazis and their fellow travelers. In popular memory, it’s a vestige of a racist ideology that the Greatest Generation did its best to scour from the Earth.
  • History, though, tells a different story.
  • King’s recent question, posed in a New York Times interview, may be appalling: “White nationalist, white supremacist, Western civilization—how did that language become offensive?” But it is apt. “That language” has an American past in need of excavation. Without such an effort, we may fail to appreciate the tenacity of the dogma it expresses, and the difficulty of eradicating it.
  • The cross between a white man and an Indian is an Indian; the cross between a white man and a Negro is a Negro; the cross between a white man and a Hindu is a Hindu; and the cross between any of the three European races and a Jew is a Jew.
  • What is judged extremist today was once the consensus of a powerful cadre of the American elite, well-connected men who eagerly seized on a false doctrine of “race suicide” during the immigration scare of the early 20th century. They included wealthy patricians, intellectuals, lawmakers, even several presidents.
  • Madison Grant. He was the author of a 1916 book called The Passing of the Great Race, which spread the doctrine of race purity all over the globe.
  • Grant’s purportedly scientific argument that the exalted “Nordic” race that had founded America was in peril, and all of modern society’s accomplishments along with it, helped catalyze nativist legislators in Congress to pass comprehensive restrictionist immigration policies in the early 1920s. His book went on to become Adolf Hitler’s “bible,” as the führer wrote to tell him
  • Grant’s doctrine has since been rejuvenated and rebranded by his ideological descendants as “white genocide.”
  • “Even though the Germans had been directly influenced by Madison Grant and the American eugenics movement, when we fought Germany, because Germany was racist, racism became unacceptable in America. Our enemy was racist; therefore we adopted antiracism as our creed.” Ever since, a strange kind of historical amnesia has obscured the American lineage of this white-nationalist ideology.
  • When Nazism reflected back that vision in grotesque form, wartime denial set in.
  • In 1853, across the Atlantic, Joseph Arthur de Gobineau, a French count, first identified the “Aryan” race as “great, noble, and fruitful in the works of man on this earth.”
  • In 1899, William Z. Ripley, an economist, concluded that Europeans consisted of “three races”: the brave, beautiful, blond “Teutons”; the stocky “Alpines”; and the swarthy “Mediterraneans.”
  • Another leading academic contributor to race science in turn-of-the-century America was a statistician named Francis Walker, who argued in The Atlantic that the new immigrants lacked the pioneer spirit of their predecessors; they were made up of “beaten men from beaten races,” whose offspring were crowding out the fine “native” stock of white people.
  • In 1901 the sociologist Edward A. Ross, who similarly described the new immigrants as “masses of fecund but beaten humanity from the hovels of far Lombardy and Galicia,” coined the term race suicide.
  • it was Grant who synthesized these separate strands of thought into one pseudo-scholarly work that changed the course of the nation’s history. In a nod to wartime politics, he referred to Ripley’s “Teutons” as “Nordics,” thereby denying America’s hated World War I rivals exclusive claim to descent from the world’s master race. He singled out Jews as a source of anxiety disproportionate to their numbers
  • The historian Nell Irvin Painter sums up the race chauvinists’ view in The History of White People (2010): “Jews manipulate the ignorant working masses—whether Alpine, Under-Man, or colored.
  • In The Passing of the Great Race, the eugenic focus on winnowing out unfit individuals made way for a more sweeping crusade to defend against contagion by inferior races. By Grant’s logic, infection meant obliteration:
  • The seed of Nazism’s ultimate objective—the preservation of a pure white race, uncontaminated by foreign blood—was in fact sown with striking success in the United States.
  • Grant, emphasizing the American experience in particular, agreed. In The Passing of the Great Race, he had argued that
  • Teddy Roosevelt, by then out of office, told Grant in 1916 that his book showed “fine fearlessness in assailing the popular and mischievous sentimentalities and attractive and corroding falsehoods which few men dare assail.”
  • President Warren Harding publicly praised one of Grant’s disciples, Lothrop Stoddard, whose book The Rising Tide of Color Against White World-Supremacy offered similar warnings about the destruction of white society by invading dusky hordes. There is “a fundamental, eternal, inescapable difference” between the races, Harding told his audience. “Racial amalgamation there cannot be.”
  • Calvin Coolidge found Grant’s thesis equally compelling. “There are racial considerations too grave to be brushed aside for any sentimental reasons. Biological laws tell us that certain divergent people will not mix or blend,” Coolidge wrote in a 1921 article in Good Housekeeping. “The Nordics propagate themselves successfully. With other races, the outcome shows deterioration on both sides. Quality of mind and body suggests that observance of ethnic law is as great a necessity to a nation as immigration law.”
  • On Capitol Hill debate raged, yet Republicans and Democrats were converging on the idea that America was a white man’s country, and must stay that way. The influx of foreigners diluted the nation with inferiors unfit for self-government, many politicians in both parties energetically concurred. The Supreme Court chimed in with decisions in a series of cases, beginning in 1901, that assigned the status of “nationals” rather than “citizens” to colonial newcomers.
  • A popular myth of American history is that racism is the exclusive province of the South. The truth is that much of the nativist energy in the U.S. came from old-money elites in the Northeast, and was also fueled by labor struggles in the Pacific Northwest, which had stirred a wave of bigotry that led to the Chinese Exclusion Act of 1882
  • He blended Nordic boosterism with fearmongering, and supplied a scholarly veneer for notions many white citizens already wanted to believe
  • When the Republicans took control of the House in 1919, Johnson became chair of the committee on immigration, “thanks to some shrewd lobbying by the Immigration Restriction League,” Spiro writes. Grant introduced him to a preeminent eugenicist named Harry Laughlin, whom Johnson named the committee’s “expert eugenics agent.” His appointment helped ensure that Grantian concerns about “race suicide” would be a driving force in a quest that culminated, half a decade later, in the Immigration Act of 1924.
  • Meanwhile, the Supreme Court was struggling mightily to define whiteness in a consistent fashion, an endeavor complicated by the empirical flimsiness of race science. In one case after another, the high court faced the task of essentially tailoring its definition to exclude those whom white elites considered unworthy of full citizenship.
  • In 1923, when an Indian veteran named Bhagat Singh Thind—who had fought for the U.S. in World War I—came before the justices with the claim of being Caucasian in the scientific sense of the term, and therefore entitled to the privileges of whiteness, they threw up their hands. In a unanimous ruling against Thind (who was ultimately made a citizen in 1936), Justice George Sutherland wrote: “What we now hold is that the words ‘free white persons’ are words of common speech to be interpreted in accordance with the understanding of the common man, synonymous with the word ‘Caucasian’ only as that word is popularly understood.” The justices had unwittingly acknowledged a consistent truth about racism, which is that race is whatever those in power say it is.
  • Grant felt his life’s work had come to fruition and, according to Spiro, he concluded, “We have closed the doors just in time to prevent our Nordic population being overrun by the lower races.” Senator Reed announced in a New York Times op-ed, “The racial composition of America at the present time thus is made permanent.” Three years later, in 1927, Johnson held forth in dire but confident tones in a foreword to a book about immigration restriction. “Our capacity to maintain our cherished institutions stands diluted by a stream of alien blood, with all its inherited misconceptions respecting the relationships of the governing power to the governed,” he warned. “The United States is our land … We intend to maintain it so. The day of unalloyed welcome to all peoples, the day of indiscriminate acceptance of all races, has definitely ended.”
  • “It was America that taught us a nation should not open its doors equally to all nations,” Adolf Hitler told The New York Times half a decade later, just one year before his elevation to chancellor in January 1933. Elsewhere he admiringly noted that the U.S. “simply excludes the immigration of certain races. In these respects America already pays obeisance, at least in tentative first steps, to the characteristic völkisch conception of the state.”
  • Harry Laughlin, the scientific expert on Representative Johnson’s committee, told Grant that the Nazis’ rhetoric sounds “exactly as though spoken by a perfectly good American eugenist,” and wrote that “Hitler should be made honorary member of the Eugenics Research Association.”
  • What the Nazis “found exciting about the American model didn’t involve just eugenics.”
  • “It also involved the systematic degradation of Jim Crow, of American deprivation of basic rights of citizenship like voting.”
  • Nazi lawyers carefully studied how the United States, despite its pretense of equal citizenship, had effectively denied that status to those who were not white. They looked at Supreme Court decisions that withheld full citizenship rights from nonwhite subjects in U.S. colonial territories. They examined cases that drew, as Thind’s had, arbitrary but hard lines around who could be considered “white.”
  • Krieger, whom Whitman describes as “the single most important figure in the Nazi assimilation of American race law,” considered the Fourteenth Amendment a problem: In his view, it codified an abstract ideal of equality at odds with human experience, and with the type of country most Americans wanted to live in.
  • In 1917, overriding President Woodrow Wilson’s veto, Congress passed a law that banned immigration not just from Asian but also from Middle Eastern countries and imposed a literacy test on new immigrants
  • “[I]t has taken us fifty years to learn that speaking English, wearing good clothes and going to school and to church do not transform a Negro into a white man.”
  • The authors of the Fourteenth Amendment, he believed, had failed to see a greater truth as they made good on the promise of the Declaration of Independence that all men are created equal: The white man is more equal than the others.
  • two “rival principles of national unity.” According to one, the U.S. is the champion of the poor and the dispossessed, a nation that draws its strength from its pluralism. According to the other, America’s greatness is the result of its white and Christian origins, the erosion of which spells doom for the national experiment.
  • Grantism, despite its swift wartime eclipse, did not become extinct. The Nazis, initially puzzled by U.S. hostility, underestimated the American commitment to democracy.
  • the South remained hawkish toward Nazi Germany because white supremacists in the U.S. didn’t want to live under a fascist government. What they wanted was a herrenvolk democracy, in which white people were free and full citizens but nonwhites were not.
  • The Nazis failed to appreciate the significance of that ideological tension. They saw allegiance to the American creed as a weakness. But U.S. soldiers of all backgrounds and faiths fought to defend it, and demanded that their country live up to it
  • historical amnesia, the excision of the memory of how the seed of racism in America blossomed into the Third Reich in Europe, has allowed Grantism to be resurrected with a new name
  • Grant’s philosophical framework has found new life among extremists at home and abroad, and echoes of his rhetoric can be heard from the Republican base and the conservative media figures the base trusts, as well as—once again—in the highest reaches of government.
  • The resurrection of race suicide as white genocide can be traced to the white supremacist David Lane, who claimed that “the term ‘racial integration’ is only a euphemism for genocide,” and whose infamous “fourteen words” manifesto, published in the 1990s, distills his credo: “We must secure the existence of our people and a future for white children.” Far-right intellectuals in Europe speak of “the great replacement” of Europeans by nonwhite immigrants and refugees.
  • That nations make decisions about appropriate levels of immigration is not inherently evil or fascist. Nor does the return of Grantian ideas to mainstream political discourse signal an inevitable march to Holocaust-level crimes against humanity.
  • The most benignly intentioned mainstream-media coverage of demographic change in the U.S. has a tendency to portray as justified the fear and anger of white Americans who believe their political power is threatened by immigration—as though the political views of today’s newcomers were determined by genetic inheritance rather than persuasion.
  • The danger of Grantism, and its implications for both America and the world, is very real. External forces have rarely been the gravest threat to the social order and political foundations of the United States. Rather, the source of greatest danger has been those who would choose white purity over a diverse democracy.
After Federalist No. 10 | National Affairs - 0 views

  • Federalist No. 10 pertains to the orientation of personal appetites toward public ends, which include both the common good and private rights. The essay recognizes that these appetites cannot be conquered, but they can be conditioned.
  • Madison's solution to the problem of faction — a solution he confines to the four corners of majority rule — is to place majorities in circumstances that encourage deliberation and thus defuse passion.
  • this solution does not depend on any specific constitutional mechanism:
  • Any republic deployed across an extended territory should be relatively free of faction, at least in the aggregate.
  • Yet Madison's solution depends on certain assumptions. Federalist No. 10 assumes politics will occur at a leisurely pace. The regime Madison foresees is relatively passive, not an active manipulator of economic arrangements. And he is able to take for granted a reasonably broad consensus as to the existence if not the content of the public good.
  • These assumptions are now collapsing under the weight of positive government and the velocity of our political life.
  • Given the centrality of Federalist No. 10 to the American constitutional canon, this collapse demands a reckoning. If a pillar of our order is crumbling, something must replace it.
  • That challenge may call for a greater emphasis on the sources of civic virtue and on the means of sustaining it.
  • The possibility that virtue might be coded into the essay is evident at its most elemental level: Federalist No. 10's definition of a faction as a group "united and actuated by some common impulse of passion, or of interest, adverse to the rights of other citizens, or to the permanent and aggregate interests of the community."
  • this definition hinges on an objective understanding of the public good; one cannot comprehend Madison from the perspective of contemporary relativism.
  • Its reader must be committed to a normative concept of the good and occupy a polity in which it is possible for such a concept to be broadly shared.
  • [T]hose who do not believe in an objective moral order cannot 'enter' Madison's system." Thus, belief in such an order, even amid disputes as to its content, constitutes a first unstated assumption of Federalist No. 10.
  • Madison presents a series of choices, repeatedly eliminating one, then bifurcating the other in turn, and eliminating again until he arrives at his solution. One can remove the causes of factions or control their effects. The causes cannot be removed because the propensity to disagree is "sown in the nature of man," arising particularly from the fact that man is "fallible" and his "opinions and his passions...have a reciprocal influence on each other."
  • Precisely because this influence arises from the link between "reason" and "self-love," the latter of which distorts the former, property accounts for "the most common and durable source of factions," the key being its durability.
  • Whereas David Hume's analysis of parties said that those based on self-interest were the most excusable while those based on passions were the most dangerous, Madison warns of the reverse. Those rooted in emotion — including "an attachment to different leaders ambitiously contending for pre-eminence and power" — are the least worrisome precisely because they are based on passions, which Madison believes to be transient.
  • A second assumption of Federalist No. 10 is consequently that irrational passions, which Madison understands to be those not based on interest, are inherently unsustainable and thus are naturally fleeting.
  • Having dismissed minority factions, Madison turns his attention to abusive majorities.
  • if a group is impelled by ill motives, the intrinsic conditions of an extended republic will make it difficult for it to become a majority.
  • A third assumption, then, is that both geographic and constitutional distance will permit the passions to dissipate before their translation into policy.
  • Finally, Madison cautions Jefferson in correspondence about a month before Federalist No. 10's publication that the extended-republic theory "can only hold within a sphere of a mean extent. As in too small a sphere oppressive combinations may be too easily formed agst. the weaker party; so in too extensive a one, a defensive concert may be rendered too difficult against the oppression of those entrusted with the administration."
  • To recapitulate, the assumptions are as follows: The people will share a belief in the existence of an objective moral order, even if they dispute its content; passions, especially when they pertain to attachments or aversions to political leaders, will be unsustainable; government will not dictate the distribution of small economic advantages; geographic and constitutional distance will operate to dissipate passions; and, finally, the territory will not be so large that public opinion cannot form.
  • none of them stands in a form that would be recognizable to Madison today.
  • ASSUMPTIONS UNDONE
  • It is almost universally acknowledged that moral relativism is ascendant in contemporary American society.
  • The question, rather, is whether the foundational assumptions of Federalist No. 10 can withstand the pressure of contemporary communications technology. There is reason to believe they cannot.
  • There is a balance to be struck: Communication is useful insofar as it makes the "mean extent" that was Madison's final assumption larger by enabling the formation of a "defensive concert" through the cultivation of public consensus against an abusive regime. But on Madison's account, the returns on rapid communication should diminish beyond this point because there will be no space in which passions can calm before impulse and decision converge.
  • what is clear is that there are enough opinions dividing the country that any project attempting to form a coherent public will seems doomed.
  • The Madisonian impulse is to look first for institutional solutions that can discipline interest groups. Constitutional mechanisms like judicial review, then, might be used to inhibit factions. But judicial review can be done well or poorly.
  • The empirical conditions not merely of an extensive republic but of 18th-century reality aided in Madison's effort. The deliberate pace of communication did not require an institutional midwife. It was a fact of life. It need hardly be said that, 230 years after the essay's November 1787 publication, this condition no longer obtains. The question is what replaces it.
  • The answer is that the converse of each assumption on which Federalist No. 10 relies is a restraining virtue.
  • If Federalist No. 10 assumes at least consensus as to the existence of an objective morality, pure moral relativism must be challenged.
  • If the immediate translation of preferences into policy is possible but detrimental, patience must intervene.
  • If technology has erased the constitutional distance between officeholders and constituents, self-restraint and deference may be required.
  • If it has also shrunk attention spans to 140 characters, an ethic of public spiritedness will have to expand them.
  • What unites these is civic virtue, and thus the American regime must now get serious about its recovery
  • He wrote in Federalist No. 55: As there is a degree of depravity in mankind which requires a certain degree of circumspection and distrust, so there are other qualities in human nature which justify a certain portion of esteem and confidence. Republican government presupposes the existence of these qualities in a higher degree than any other form. Were the pictures which have been drawn by the political jealousy of some among us faithful likenesses of the human character, the inference would be that there is not sufficient virtue among men for self-government; and that nothing less than the chains of despotism can restrain them from destroying and devouring one another.
  • At Virginia's ratifying convention, similarly, Madison noted the propensity to assume either the worst or the best from politicians. He replied:
  • But I go on this great republican principle, that the people will have virtue and intelligence to select men of virtue and wisdom. Is there no virtue among us? If there be not, we are in a wretched situation. No theoretical checks — no form of government can render us secure. To suppose that any form of government will secure liberty or happiness without any virtue in the people, is a chimerical idea.
  • Still, the traditional means of inculcating virtue — the family and institutions such as local schools — are themselves under pressure or subject to political capture.
  • A national effort to instill civic virtue would almost certainly careen into the kind of politicization that has been witnessed in Education Department history standards and the like.
  • Consequently, subsidiarity, the diffusion of authority to the most local possible level, would be vital to any effective effort to revive civic virtue. That is, it could not be uniform or imposed from on high. Political leaders could help in cultivating an awareness of its necessity, but not in dictating its precise terms.
  • The first part of this combination is moral virtue, which the ethic of subsidiarity teaches is likelier to come from the home than from school, and from life lessons than from textbooks.
  • Students as early as elementary school routinely learn the virtues of the Bill of Rights, in part because it is shorter and simpler to teach than the main body of the Constitution.
  • The success of civic education is nowhere clearer than in the arguably distorting effect it has had in provoking what Mary Ann Glendon calls "rights talk," the substitution of assertions of rights for persuasive argumentation about politics
  • Of these virtues, patience will surely be the hardest to restore. This is, to be clear, patience not as a private but rather as a civic virtue.
  • It asks that they consider issues in dimensions deeper than a tweet or, more precisely, that they demand that those they elect do so and thus do not expect their passions to be regularly fed.
  • Perhaps the best that can be achieved here is refusing to allow the positive state to reach further into the minutiae of economic life, generating more spaces for minority factions to hide.
  • As any reader of Lincoln's Temperance Address knows, neither heroic self-restraint nor clobbering, moralistic education will succeed in inculcating such virtues as patience and moderation. A combined educational program is necessary, and politics in any modern sense can only account for part of it.
  • civic education can achieve constitutional ends. Of course, rights as contemporarily understood are entitlements; they supply us with something. Civic virtue, by contrast, demands something of us, and as such presents a more substantial political challenge.
  • The second is a shift in civic education from the entitlement mentality of the Bill of Rights to the constitutional architecture of the overall regime, with the latter engendering an appreciation of the cadences and distances at which it is intended to function and the limited objects it is intended to attain.
  • While Madison's "mean extent" for a republic has, in the modern United States, far exceeded the scope possible for forming a public will with respect to most particular issues, it may still be possible to form a coherent if thin understanding of the regime and, consequently, a defensive concert to safeguard it.
  • a recognition that virtue is more necessary now than it used to be — when empirical conditions imposed patience and distance — does not rely on virtue in any blind or total sense. It does not, for example, seek to replace the institutional mechanisms Madison elucidates elsewhere with virtue. It simply recognizes that the particular assumptions of Federalist No. 10 no longer operate without added assistance. In other words, as Daniel Mahoney has argued, we must theorize the virtue that the founders could presuppose.
  • The issue, then, is not that civic virtue is all that is important to the Madisonian system; it is that civic virtue is more important than it used to be for one pillar of that system.