These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanity is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders, and the Bill of Rights forbids the federal government from establishing a religion.
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprung up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change.
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
    [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government, and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • Like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

12 Rules for Life: An Antidote to Chaos (Jordan B. Peterson) - 0 views

  • RULES? MORE RULES? REALLY? Isn’t life complicated enough, restricting enough, without abstract rules that don’t take our unique, individual situations into account? And given that our brains are plastic, and all develop differently based on our life experiences, why even expect that a few rules might be helpful to us all?
  • “I’ve got some good news…and I’ve got some bad news,” the lawgiver yells to them. “Which do you want first?” “The good news!” the hedonists reply. “I got Him from fifteen commandments down to ten!” “Hallelujah!” cries the unruly crowd. “And the bad?” “Adultery is still in.”
  • Maps of Meaning was sparked by Jordan’s agonized awareness, as a teenager growing up in the midst of the Cold War, that much of mankind seemed on the verge of blowing up the planet to defend their various identities. He felt he had to understand how it could be that people would sacrifice everything for an “identity,”
  • the story of the golden calf also reminds us that without rules we quickly become slaves to our passions—and there’s nothing freeing about that.
  • And the story suggests something more: unchaperoned, and left to our own untutored judgment, we are quick to aim low and worship qualities that are beneath us—in this case, an artificial animal that brings out our own animal instincts in a completely unregulated way.
  • Similarly, in this book Professor Peterson doesn’t just propose his twelve rules, he tells stories, too, bringing to bear his knowledge of many fields as he illustrates and explains why the best rules do not ultimately restrict us but instead facilitate our goals and make for fuller, freer lives.
  • Peterson wasn’t really an “eccentric”; he had sufficient conventional chops, had been a Harvard professor, was a gentleman (as cowboys can be) though he did say damn and bloody a lot, in a rural 1950s sort of way. But everyone listened, with fascination on their faces, because he was in fact addressing questions of concern to everyone at the table.
  • unlike many academics who take the floor and hold it, if someone challenged or corrected him he really seemed to like it. He didn’t rear up and neigh. He’d say, in a kind of folksy way, “Yeah,” and bow his head involuntarily, wag it if he had overlooked something, laughing at himself for overgeneralizing. He appreciated being shown another side of an issue, and it became clear that thinking through a problem was, for him, a dialogic process.
  • for an egghead Peterson was extremely practical. His examples were filled with applications to everyday life: business management, how to make furniture (he made much of his own), designing a simple house, making a room beautiful (now an internet meme) or in another, specific case related to education, creating an online writing project that kept minority students from dropping out of school by getting them to do a kind of psychoanalytic exercise on themselves,
  • These Westerners were different: self-made, unentitled, hands on, neighbourly and less precious than many of their big-city peers, who increasingly spend their lives indoors, manipulating symbols on computers. This cowboy psychologist seemed to care about a thought only if it might, in some way, be helpful to someone.
  • I was drawn to him because here was a clinician who also had given himself a great books education, and who not only loved soulful Russian novels, philosophy and ancient mythology, but who also seemed to treat them as his most treasured inheritance. But he also did illuminating statistical research on personality and temperament, and had studied neuroscience. Though trained as a behaviourist, he was powerfully drawn to psychoanalysis with its focus on dreams, archetypes, the persistence of childhood conflicts in the adult, and the role of defences and rationalization in everyday life. He was also an outlier in being the only member of the research-oriented Department of Psychology at the University of Toronto who also kept a clinical practice.
  • Maps of Meaning, published nearly two decades ago, shows Jordan’s wide-ranging approach to understanding how human beings and the human brain deal with the archetypal situation that arises whenever we, in our daily lives, must face something we do not understand.
  • The brilliance of the book is in his demonstration of how rooted this situation is in evolution, our DNA, our brains and our most ancient stories. And he shows that these stories have survived because they still provide guidance in dealing with uncertainty, and the unavoidable unknown.
  • this is why many of the rules in this book, being based on Maps of Meaning, have an element of universality to them.
  • We are ambivalent about rules, even when we know they are good for us. If we are spirited souls, if we have character, rules seem restrictive, an affront to our sense of agency and our pride in working out our own lives. Why should we be judged according to another’s rule?
  • And he felt he had to understand the ideologies that drove totalitarian regimes to a variant of that same behaviour: killing their own citizens.
  • Ideologies are simple ideas, disguised as science or philosophy, that purport to explain the complexity of the world and offer remedies that will perfect it.
  • Ideologues are people who pretend they know how to “make the world a better place” before they’ve taken care of their own chaos within.
  • Ideologies are substitutes for true knowledge, and ideologues are always dangerous when they come to power, because a simple-minded I-know-it-all approach is no match for the complexity of existence.
  • To understand ideology, Jordan read extensively about not only the Soviet gulag, but also the Holocaust and the rise of Nazism. I had never before met a person, born Christian and of my generation, who was so utterly tormented by what happened in Europe to the Jews, and who had worked so hard to understand how it could have occurred.
  • I saw what now millions have seen online: a brilliant, often dazzling public speaker who was at his best riffing like a jazz artist; at times he resembled an ardent Prairie preacher (not in evangelizing, but in his passion, in his ability to tell stories that convey the life-stakes that go with believing or disbelieving various ideas). Then he’d just as easily switch to do a breathtakingly systematic summary of a series of scientific studies. He was a master at helping students become more reflective, and take themselves and their futures seriously. He taught them to respect many of the greatest books ever written. He gave vivid examples from clinical practice, was (appropriately) self-revealing, even of his own vulnerabilities, and made fascinating links between evolution, the brain and religious stories.
  • Above all, he alerted his students to topics rarely discussed in university, such as the simple fact that all the ancients, from Buddha to the biblical authors, knew what every slightly worn-out adult knows, that life is suffering.
  • chances are, if you or someone you love is not suffering now, they will be within five years, unless you are freakishly lucky. Rearing kids is hard, work is hard, aging, sickness and death are hard, and Jordan emphasized that doing all that totally on your own, without the benefit of a loving relationship, or wisdom, or the psychological insights of the greatest psychologists, only makes it harder.
  • focused on triumphant heroes. In all these triumph stories, the hero has to go into the unknown, into an unexplored territory, and deal with a new great challenge and take great risks. In the process, something of himself has to die, or be given up, so he can be reborn and meet the challenge. This requires courage, something rarely discussed in a psychology class or textbook.
  • Jordan
  • views of his first YouTube statements quickly numbered in the hundreds of thousands. But people have kept listening because what he is saying meets a deep and unarticulated need. And that is because alongside our wish to be free of rules, we all search for structure.
  • the first generation to have been so thoroughly taught two seemingly contradictory ideas about morality, simultaneously—at their schools, colleges and universities, by many in my own generation. This contradiction has left them at times disoriented and uncertain, without guidance and, more tragically, deprived of riches they don’t even know exist.
  • morality and the rules associated with it are just a matter of personal opinion or happenstance, “relative to” or “related to” a particular framework, such as one’s ethnicity, one’s upbringing, or the culture or historical…
  • The first idea or teaching is that morality is relative, at best a…
  • So, the decent thing to do—once it becomes apparent how arbitrary your, and your society’s, “moral values” are—is to show tolerance for people who think differently, and…
  • That emphasis on tolerance is so paramount that for many people one of the worst character flaws a person can have is to be “judgmental.”* And, since we don’t know right from wrong, or what is good, just about the most inappropriate thing an…
  • And so a generation has been raised untutored in what was once called, aptly, “practical wisdom,” which guided previous generations. Millennials, often told they have received the finest education available anywhere, have actually…
  • professors, chose to devalue thousands of years of human knowledge about how to acquire virtue, dismissing it as passé, “…
  • They were so successful at it that the very word “virtue” sounds out of date, and someone using it appears…
  • The study of virtue is not quite the same as the study of morals (right and wrong, good and evil). Aristotle defined the virtues simply as the ways of behaving that are most conducive to happiness in life. Vice was…
  • Cultivating judgment about the difference between virtue and vice is the beginning of wisdom, something…
  • By contrast, our modern relativism begins by asserting that making judgments about how to live is impossible, because there is no real good, and no…
  • Thus relativism’s closest approximation to “virtue” is “tolerance.” Only tolerance will provide social cohesion between different groups, and save us from harming each other. On Facebook and other forms of social media, therefore, you signal your so-called…
  • Intolerance of others’ views (no matter how ignorant or incoherent they may be) is not simply wrong; in a world where there is no right or wrong, it is worse: it is a sign you are…
  • But it turns out that many people cannot tolerate the vacuum—the chaos—which is inherent in life, but made worse by this moral relativism; they cannot live without a moral compass,…
  • So, right alongside relativism, we find the spread of nihilism and despair, and also the opposite of moral relativism: the blind certainty offered by ideologies…
  • Dr. Norman Doidge, MD, is the author of The Brain That Changes Itself
  • so we arrive at the second teaching that millennials have been bombarded with. They sign up for a humanities course, to study the greatest books ever written. But they’re not assigned the books; instead they are given…
  • (But the idea that we can easily separate facts and values was and remains naive; to some extent, one’s values determine what one will pay…
  • For the ancients, the discovery that different people have different ideas about how, practically, to live, did not paralyze them; it deepened their understanding of humanity and led to some of the most satisfying conversations human beings have ever had, about how life might be lived.
  • Modern moral relativism has many sources. As we in the West learned more history, we understood that different epochs had different moral codes. As we travelled the seas and explored the globe, we learned of far-flung tribes on different continents whose different moral codes made sense relative to, or within the framework of, their societies. Science played a role, too, by attacking the religious view of the world, and thus undermining the religious grounds for ethics and rules. Materialist social science implied that we could divide the world into facts (which all could observe, and were objective and “real”) and values (…
  • it seems that all human beings are, by some kind of biological endowment, so ineradicably concerned with morality that we create a structure of laws and rules wherever we are. The idea that human life can be free of moral concerns is a fantasy.
  • given that we are moral animals, what must be the effect of our simplistic modern relativism upon us? It means we are hobbling ourselves by pretending to be something we are not. It is a mask, but a strange one, for it mostly deceives the one who wears it.
  • Far better to integrate the best of what we are now learning with the books human beings saw fit to preserve over millennia, and with the stories that have survived, against all odds, time’s tendency to obliterate.
  • these really are rules. And the foremost rule is that you must take responsibility for your own life. Period.
  • Jordan’s message that each individual has ultimate responsibility to bear; that if one wants to live a full life, one first sets one’s own house in order; and only then can one sensibly aim to take on bigger responsibilities.
  • if it’s uncertain that our ideals are attainable, why do we bother reaching in the first place? Because if you don’t reach for them, it is certain you will never feel that your life has meaning.
  • And perhaps because, as unfamiliar and strange as it sounds, in the deepest part of our psyche, we all want to be judged.
  • Instead of despairing about these differences in moral codes, Aristotle argued that though specific rules, laws and customs differed from place to place, what does not differ is that in all places human beings, by their nature, have a proclivity to make rules, laws and customs.
  • Freud never argued (as do some who want all culture to become one huge group therapy session) that one can live one’s entire life without ever making judgments, or without morality. In fact, his point in Civilization and Its Discontents is that civilization only arises when some restraining rules and morality are in place.
  • Aleksandr Solzhenitsyn, the great documenter of the slave-labour-camp horrors of the latter, once wrote that the “pitiful ideology” holding that “human beings are created for happiness” was an ideology “done in by the first blow of the work assigner’s cudgel.”1 In a crisis, the inevitable suffering that life entails can rapidly make a mockery of the idea that happiness is the proper pursuit of the individual. On the radio show, I suggested, instead, that a deeper meaning was required. I noted that the nature of such meaning was constantly re-presented in the great stories of the past, and that it had more to do with developing character in the face of suffering than with happiness.
  • I proposed in Maps of Meaning that the great myths and religious stories of the past, particularly those derived from an earlier, oral tradition, were moral in their intent, rather than descriptive. Thus, they did not concern themselves with what the world was, as a scientist might have it, but with how a human being should act.
  • I suggested that our ancestors portrayed the world as a stage—a drama—instead of a place of objects. I described how I had come to believe that the constituent elements of the world as drama were order and chaos, and not material things.
  • Order is where the people around you act according to well-understood social norms, and remain predictable and cooperative. It’s the world of social structure, explored territory, and familiarity. The state of Order is typically portrayed, symbolically—imaginatively—as masculine.
  • Chaos, by contrast, is where—or when—something unexpected happens.
  • As the antithesis of symbolically masculine order, it’s presented imaginatively as feminine. It’s the new and unpredictable suddenly emerging in the midst of the commonplace familiar. It’s Creation and Destruction,
  • Order is the white, masculine serpent; Chaos, its black, feminine counterpart. The black dot in the white—and the white in the black—indicate the possibility of transformation: just when things seem secure, the unknown can loom, unexpectedly and large. Conversely, just when everything seems lost, new order can emerge from catastrophe and chaos.
  • For the Taoists, meaning is to be found on the border between the ever-entwined pair. To walk that border is to stay on the path of life, the divine Way. And that’s much better than happiness.
  • trying to address a perplexing problem: the reason or reasons for the nuclear standoff of the Cold War. I couldn’t understand how belief systems could be so important to people that they were willing to risk the destruction of the world to protect them. I came to realize that shared belief systems made people intelligible to one another—and that the systems weren’t just about belief.
  • People who live by the same code are rendered mutually predictable to one another. They act in keeping with each other’s expectations and desires. They can cooperate. They can even compete peacefully, because everyone knows what to expect from everyone else.
  • Shared beliefs simplify the world, as well, because people who know what to expect from one another can act together to tame the world. There is perhaps nothing more important than the maintenance of this organization—this simplification. If it’s threatened, the great ship of state rocks.
  • It isn’t precisely that people will fight for what they believe. They will fight, instead, to maintain the match between what they believe, what they expect, and what they desire. They will fight to maintain the match between what they expect and how everyone is acting. It is precisely the maintenance of that match that enables everyone
  • There’s more to it, too. A shared cultural system stabilizes human interaction, but is also a system of value—a hierarchy of value, where some things are given priority and importance and others are not. In the absence of such a system of value, people simply cannot act. In fact, they can’t even perceive, because both action and perception require a goal, and a valid goal is, by necessity, something valued.
  • We experience much of our positive emotion in relation to goals. We are not happy, technically speaking, unless we see ourselves progressing—and the very idea of progression implies value.
  • Worse yet is the fact that the meaning of life without positive value is not simply neutral. Because we are vulnerable and mortal, pain and anxiety are an integral part of human existence. We must have something to set against the suffering that is intrinsic to Being.*2 We must have the meaning inherent in a profound system of value or the horror of existence rapidly becomes paramount. Then, nihilism beckons, with its hopelessness and despair.
  • So: no value, no meaning. Between value systems, however, there is the possibility of conflict. We are thus eternally caught between the most diamantine rock and the hardest of places: loss of group-centred belief renders life chaotic, miserable, intolerable; presence of group-centred belief makes conflict with other groups inevitable.
  • In the West, we have been withdrawing from our tradition-, religion- and even nation-centred cultures, partly to decrease the danger of group conflict. But we are increasingly falling prey to the desperation of meaninglessness, and that is no improvement at all.
  • While writing Maps of Meaning, I was (also) driven by the realization that we can no longer afford conflict—certainly not on the scale of the world conflagrations of the twentieth century.
  • I came to a more complete, personal realization of what the great stories of the past continually insist upon: the centre is occupied by the individual.
  • It is possible to transcend slavish adherence to the group and its doctrines and, simultaneously, to avoid the pitfalls of its opposite extreme, nihilism. It is possible, instead, to find sufficient meaning in individual consciousness and experience.
  • How could the world be freed from the terrible dilemma of conflict, on the one hand, and psychological and social dissolution, on the other? The answer was this: through the elevation and development of the individual, and through the willingness of everyone to shoulder the burden of Being and to take the heroic path. We must each adopt as much responsibility as possible for individual life, society and the world.
  • We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated. It is in this manner that we can and must reduce the suffering that poisons the world. It’s asking a lot. It’s asking for everything.
  • the alternative—the horror of authoritarian belief, the chaos of the collapsed state, the tragic catastrophe of the unbridled natural world, the existential angst and weakness of the purposeless individual—is clearly worse.
  • a title: 12 Rules for Life: An Antidote to Chaos. Why did that one rise up above all others? First and foremost, because of its simplicity. It indicates clearly that people need ordering principles, and that chaos otherwise beckons.
  • We require rules, standards, values—alone and together. We’re pack animals, beasts of burden. We must bear a load, to justify our miserable existence. We require routine and tradition. That’s order. Order can become excessive, and that’s not good, but chaos can swamp us, so we drown—and that is also not good. We need to stay on the straight and narrow path.
  • I hope that these rules and their accompanying essays will help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life.
  • RULE 1   STAND UP STRAIGHT WITH YOUR SHOULDERS BACK
  • Because territory matters, and because the best locales are always in short supply, territory-seeking among animals produces conflict. Conflict, in turn, produces another problem: how to win or lose without the disagreeing parties incurring too great a cost.
  • It’s winner-take-all in the lobster world, just as it is in human societies, where the top 1 percent have as much loot as the bottom 50 percent11—and where the richest eighty-five people have as much as the bottom three and a half billion.
  • This principle is sometimes known as Price’s law, after Derek J. de Solla Price,13 the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal.
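The concentration described here can be sketched numerically. A minimal illustration, assuming hypothetical values throughout (the text gives no numbers for Price's law itself, which is often summarized as "half the output comes from the square root of the number of contributors"): simulate a heavy-tailed, Pareto-like productivity distribution and measure what share of the total the top sqrt(n) contributors produce.

```python
import random

# Illustrative sketch only — all parameters here are assumptions, not from the text.
random.seed(42)

n = 10_000          # hypothetical number of contributors
alpha = 1.16        # Pareto shape roughly matching the classic "80/20" split

# Heavy-tailed productivity per contributor, sorted most-productive first.
productivity = sorted((random.paretovariate(alpha) for _ in range(n)), reverse=True)

total = sum(productivity)
top_sqrt_n = int(n ** 0.5)   # ~100 people out of 10,000
share_of_top = sum(productivity[:top_sqrt_n]) / total

# With a tail this heavy, a tiny elite captures a disproportionate share,
# giving the approximately L-shaped graph the text describes.
print(f"top {top_sqrt_n} of {n} contributors hold {share_of_top:.0%} of the total")
```

Sorting productivity on one axis against head-count on the other reproduces the L-shape: a short, tall spike of highly productive individuals and a long, flat tail of everyone else.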
  • Instead of undertaking the computationally difficult task of identifying the best man, the females outsource the problem to the machine-like calculations of the dominance hierarchy. They let the males fight it out and peel their paramours from the top.
  • The dominant male, with his upright and confident posture, not only gets the prime real estate and easiest access to the best hunting grounds. He also gets all the girls. It is exponentially more worthwhile to be successful, if you are a lobster, and male.
  • dominance hierarchies have been an essentially permanent feature of the environment to which all complex life has adapted. A third of a billion years ago, brains and nervous systems were comparatively simple. Nonetheless, they already had the structure and neurochemistry necessary to process information about status and society. The importance of this fact can hardly be overstated.
  • evolution works, in large part, through variation and natural selection. Variation exists for many reasons, including gene-shuffling (to put it simply) and random mutation. Individuals vary within a species for such reasons. Nature chooses from among them, across time. That theory, as stated, appears to account for the continual alteration of life-forms over the eons.
  • But there’s an additional question lurking under the surface: what exactly is the “nature” in “natural selection”? What exactly is “the environment” to which animals adapt?
  • Nature “selects.” The idea of selects contains implicitly nested within it the idea of fitness. It is “fitness” that is “selected.” Fitness, roughly speaking, is the probability that a given organism will leave offspring (will propagate its genes through time). The “fit” in “fitness” is therefore the matching of organismal attribute to environmental demand.
  • But nature, the selecting agent, is not a static selector—not in any simple sense.
  • As the environment supporting a species transforms and changes, the features that make a given individual successful in surviving and reproducing also transform and change. Thus, the theory of natural selection does not posit creatures matching themselves ever more precisely to a template specified by the world. It is more that creatures are in a dance with nature, albeit one that is deadly.
  • Nature is not simply dynamic, either. Some things change quickly, but they are nested within other things that change less quickly (music
  • It’s chaos, within order, within chaos, within higher order. The order that is most real is the order that is most unchanging—and that is not necessarily the order that is most easily seen. The leaf, when perceived, might blind the observer to the tree. The tree can blind him to the forest.
  • It is also a mistake to conceptualize nature romantically.
  • Unfortunately, “the environment” is also elephantiasis and guinea worms (don’t ask), anopheles mosquitoes and malaria, starvation-level droughts, AIDS and the Black Plague.
  • It is because of the existence of such things, of course, that we attempt to modify our surroundings, protecting our children, building cities and transportation systems and growing food and generating power.
  • this brings us to a third erroneous concept: that nature is something strictly segregated from the cultural constructs that have emerged within it.
  • It does not matter whether that feature is physical and biological, or social and cultural. All that matters, from a Darwinian perspective, is permanence—and the dominance hierarchy, however social or cultural it might appear, has been around for some half a billion years.
  • The dominance hierarchy is not capitalism. It’s not communism, either, for that matter. It’s not the military-industrial complex. It’s not the patriarchy—that disposable, malleable, arbitrary cultural artefact. It’s not even a human creation; not in the most profound sense. It is instead a near-eternal aspect of the environment, and much of what is blamed on these more ephemeral manifestations is a consequence of its unchanging existence.
  • We were struggling for position before we had skin, or hands, or lungs, or bones. There is little more natural than culture. Dominance hierarchies are older than trees.
  • The part of our brain that keeps track of our position in the dominance hierarchy is therefore exceptionally ancient and fundamental.17 It is a master control system, modulating our perceptions, values, emotions, thoughts and actions. It powerfully affects every aspect of our Being, conscious and unconscious alike.
  • The ancient part of your brain specialized for assessing dominance watches how you are treated by other people. On that evidence, it renders a determination of your value and assigns you a status. If you are judged by your peers as of little worth, the counter restricts serotonin availability. That makes you much more physically and psychologically reactive to any circumstance or event that might produce emotion, particularly if it is negative. You need that reactivity. Emergencies are common at the bottom, and you must be ready to survive. Unfortunately, that physical hyper-response, that constant alertness, burns up a lot of precious energy and physical resources.
  • It will leave you far more likely to live, or die, carelessly, for a rare opportunity at pleasure, when it manifests itself. The physical demands of emergency preparedness will wear you down in every way.21
  • If you have a high status, on the other hand, the counter’s cold, pre-reptilian mechanics assume that your niche is secure, productive
  • You can delay gratification, without forgoing it forever. You can afford to be a reliable and thoughtful citizen.
  • Sometimes, however, the counter mechanism can go wrong. Erratic habits of sleeping and eating can interfere with its function. Uncertainty can throw it for a loop. The body, with its various parts, needs to function like a well-rehearsed orchestra. Every system must play its role properly, and at exactly the right time, or noise and chaos ensue. It is for this reason that routine is so necessary. The acts of life we repeat every day need to be automatized. They must be turned into stable and reliable habits, so they lose their complexity and gain predictability and simplicity.
  • It is for such reasons that I always ask my clinical clients first about sleep. Do they wake up in the morning at approximately the time the typical person wakes up, and at the same time every day?
  • The next thing I ask about is breakfast. I counsel my clients to eat a fat and protein-heavy breakfast as soon as possible after they awaken (no simple carbohydrates, no sugars,
  • I have had many clients whose anxiety was reduced to subclinical levels merely because they started to sleep on a predictable schedule and eat breakfast.
  • Other bad habits can also interfere with the counter’s accuracy.
  • There are many systems of interaction between brain, body and social world that can get caught in positive feedback loops. Depressed people, for example, can start feeling useless and burdensome, as well as grief-stricken and pained. This makes them withdraw from contact with friends and family. Then the withdrawal makes them more lonesome and isolated, and more likely to feel useless and burdensome. Then they withdraw more. In this manner, depression spirals and amplifies.
  • If someone is badly hurt at some point in life—traumatized—the dominance counter can transform in a manner that makes additional hurt more rather than less likely. This often happens in the case of people, now adults, who were viciously bullied during childhood or adolescence. They become anxious and easily upset. They shield themselves with a defensive crouch, and avoid the direct eye contact interpretable as a dominance challenge.
  • With their capacity for aggression strait-jacketed within a too-narrow morality, those who are only or merely compassionate and self-sacrificing (and naïve and exploitable) cannot call forth the genuinely righteous and appropriately self-protective anger necessary to defend themselves. If you can bite, you generally don’t have to. When skillfully integrated, the ability to respond with aggression and violence decreases rather than increases the probability that actual aggression will become necessary.
  • Naive, harmless people usually guide their perceptions and actions with a few simple axioms: people are basically good; no one really wants to hurt anyone else; the threat (and, certainly, the use) of force, physical or otherwise, is wrong. These axioms collapse, or worse, in the presence of individuals who are genuinely malevolent.27
  • I have had clients who were terrified into literally years of daily hysterical convulsions by the sheer look of malevolence on their attackers’ faces. Such individuals typically come from hyper-sheltered families, where nothing terrible is allowed to exist, and everything is fairyland wonderful (or else).
  • When the wakening occurs—when once-naïve people recognize in themselves the seeds of evil and monstrosity, and see themselves as dangerous (at least potentially)— their fear decreases. They develop more self-respect. Then, perhaps, they begin to resist oppression. They see that they have the ability to withstand, because they are terrible too. They see they can and must stand up, because they begin to understand how genuinely monstrous they will become, otherwise,
  • There is very little difference between the capacity for mayhem and destruction, integrated, and strength of character. This is one of the most difficult lessons of life.
  • even if you came by your poor posture honestly—even if you were unpopular or bullied at home or in grade school28—it’s not necessarily appropriate now. Circumstances change. If you slump around, with the same bearing that characterizes a defeated lobster, people will assign you a lower status, and the old counter that you share with crustaceans, sitting at the very base of your brain, will assign you a low dominance number.
  • the other, far more optimistic lesson of Price’s law and the Pareto distribution: those who start to have will probably get more.
  • Some of these upwardly moving loops can occur in your own private, subjective space.
  • If you are asked to move the muscles one by one into a position that looks happy, you will report feeling happier. Emotion is partly bodily expression, and can be amplified (or dampened) by that expression.29
  • To stand up straight with your shoulders back is to accept the terrible responsibility of life, with eyes wide open.
  • It means deciding to voluntarily transform the chaos of potential into the realities of habitable order. It means adopting the burden of self-conscious vulnerability, and accepting the end of the unconscious paradise of childhood, where finitude and mortality are only dimly comprehended. It means willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language).
  • So, attend carefully to your posture. Quit drooping and hunching around. Speak your mind. Put your desires forward, as if you had a right to them—at least the same right as others. Walk tall and gaze forthrightly ahead. Dare to be dangerous. Encourage the serotonin to flow plentifully through the neural pathways desperate for its calming influence.
  • Thus emboldened, you will embark on the voyage of your life, let your light shine, so to speak, on the heavenly hill, and pursue your rightful destiny. Then the meaning of your life may be sufficient to keep the corrupting influence of mortal despair at bay. Then you may be able to accept the terrible burden of the World, and find joy.
  • RULE 2   TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING
  • People are better at filling and properly administering prescription medication to their pets than to themselves.
  • It is difficult to conclude anything from this set of facts except that people appear to love their dogs, cats, ferrets and birds (and maybe even their lizards) more than themselves. How horrible is that? How much shame must exist, for something like that to be true? What could it be about people that makes them prefer their pets to themselves?
  • To understand Genesis 1, the Priestly story, with its insistence on speech as the fundamental creative force, it is first necessary to review a few fundamental, ancient assumptions (these are markedly different in type and intent from the assumptions of science, which are, historically speaking, quite novel).
  • those who existed during the distant time in which the foundational epics of our culture emerged were much more concerned with the actions that dictated survival (and with interpreting the world in a manner commensurate with that goal) than with anything approximating what we now understand as objective truth.
  • Before the dawn of the scientific worldview, reality was construed differently. Being was understood as a place of action, not a place of things.31 It was understood as something more akin to story or drama. That story or drama was lived, subjective experience, as it manifested itself moment to moment in the consciousness of every living person.
  • subjective pain. That’s something so real no argument can stand against it. Everyone acts as if their pain is real—ultimately, finally real. Pain matters, more than matter matters. It is for this reason, I believe, that so many of the world’s traditions regard the suffering attendant upon existence as the irreducible truth of Being.
  • In any case, that which we subjectively experience can be likened much more to a novel or a movie than to a scientific description of physical reality.
  • The Domain, Not of Matter, but of What Matters
  • the world of experience has primal constituents, as well. These are the necessary elements whose interactions define drama and fiction. One of these is chaos. Another is order. The third (as there are three) is the process that mediates between the two, which appears identical to what modern people call consciousness.
  • Chaos is the domain of ignorance itself. It’s unexplored territory. Chaos is what extends, eternally and without limit, beyond the boundaries of all states, all ideas, and all disciplines. It’s the foreigner, the stranger, the member of another gang, the rustle in the bushes in the night-time,
  • It is, in short, all those things and situations we neither know nor understand.
  • Chaos is also the formless potential from which the God of Genesis 1 called forth order using language at the beginning of time. It’s the same potential from which we, made in that Image, call forth the novel and ever-changing moments of our lives. And Chaos is freedom, dreadful freedom, too.
  • Order, by contrast, is explored territory. That’s the hundreds-of-millions-of-years-old hierarchy of place, position and authority. That’s the structure of society. It’s the structure provided by biology, too—particularly insofar as you are adapted, as you are, to the structure of society. Order is tribe, religion, hearth, home and country.
  • Order is the public façade we’re called upon to wear, the politeness of a gathering of civilized strangers, and the thin ice on which we all skate. Order is the place where the behavior of the world matches our expectations and our desires; the place where all things turn out the way we want them to.
  • But order is sometimes tyranny and stultification, as well, when the demand for certainty and uniformity and purity becomes too one-sided.
  • In order, we’re able to think about things in the long term. There, things work, and we’re stable, calm and competent. We seldom leave places we understand—geographical or conceptual—for that reason, and we certainly do not like it when we are compelled to or when it happens accidentally.
  • When the same person betrays you, sells you out, you move from the daytime world of clarity and light to the dark underworld of chaos, confusion and despair. That’s the same move you make, and the same place you visit, when the company you work for starts to fail and your job is placed in doubt.
  • Before the Twin Towers fell—that was order. Chaos manifested itself afterward. Everyone felt it. The very air became uncertain. What exactly was it that fell? Wrong question. What exactly remained standing? That was the issue at hand.
  • Chaos is the deep ocean bottom to which Pinocchio voyaged to rescue his father from Monstro, whale and fire-breathing dragon. That journey into darkness and rescue is the most difficult thing a puppet must do, if he wants to be real; if he wants to extract himself from the temptations of deceit and acting and victimization and impulsive pleasure and totalitarian subjugation; if he wants to take his place as a genuine Being in the world.
  • Chaos is the new place and time that emerges when tragedy strikes suddenly, or malevolence reveals its paralyzing visage, even in the confines of your own home. Something unexpected or undesired can always make its appearance, when a plan is being laid out, regardless of how familiar the circumstances.
  • Our brains respond instantly when chaos appears, with simple, hyper-fast circuits maintained from the ancient days, when our ancestors dwelled in trees, and snakes struck in a flash.32 After that nigh-instantaneous, deeply reflexive bodily response comes the later-evolving, more complex but slower responses of emotions—and, after that, comes thinking, of the higher order, which can extend over seconds, minutes or years. All that response is instinctive, in some sense—but the faster the response, the more instinctive.
  • Things or objects are part of the objective world. They’re inanimate; spiritless. They’re dead. This is not true of chaos and order. Those are perceived, experienced and understood (to the degree that they are understood at all) as personalities—and that is just as true of the perceptions, experiences and understanding of modern people as their ancient forebears. It’s just that moderns don’t notice.
  • Perception of things as entities with personality also occurs before perception of things as things. This is particularly true of the action of others,34 living others, but we also see the non-living “objective world” as animated, with purpose and intent.
  • This is because of the operation of what psychologists have called “the hyperactive agency detector” within us.35 We evolved, over millennia, within intensely social circumstances. This means that the most significant elements of our environment of origin were personalities, not things, objects or situations.
  • The personalities we have evolved to perceive have been around, in predictable form, and in typical, hierarchical configurations, forever, for all intents and purposes. They have been…
  • the category of “parent” and/or “child” has been around for 200 million years. That’s longer than birds have existed. That’s longer than flowers have grown. It’s not a billion years, but it’s still a very long time. It’s plenty long enough for male and female and parent and child to serve as vital and fundamental parts of the environment to which we have adapted. This means that male and female and parent and child are…
  • Our brains are deeply social. Other creatures (particularly, other humans) were crucially important to us as we lived, mated and evolved. Those creatures were…
  • From a Darwinian perspective, nature—reality itself; the environment, itself—is what selects. The environment cannot be defined in any more fundamental manner. It is not mere inert matter. Reality itself is whatever we contend with when we are striving to survive and reproduce. A…
  • as our brain capacity increased and we developed curiosity to spare, we became increasingly aware of and curious about the nature of the world—what we eventually conceptualized as the objective…
  • “outside” is not merely unexplored physical territory. Outside is outside of what we currently understand—and understanding is dealing with and coping with…
  • when we first began to perceive the unknown, chaotic, non-animal world, we used categories that had originally evolved to represent the pre-human animal social world. Our minds are far older than mere…
  • Our most…
  • category—as old, in some sense, as the sexual act itself—appears to be that of sex, male and female. We appear to have taken that primordial knowledge of structured, creative opposition and…
  • Order, the known, appears symbolically associated with masculinity (as illustrated in the aforementioned yang of the Taoist yin-yang symbol). This is perhaps because the primary…
  • Chaos—the unknown—is symbolically associated with the feminine. This is partly because all the things we have come to know were born, originally, of the unknown, just as all beings we encounter were born of mothers. Chaos is mater, origin, source, mother; materia, the substance from which all things are made.
  • In its positive guise, chaos is possibility itself, the source of ideas, the mysterious realm of gestation and birth. As a negative force, it’s the impenetrable darkness of a cave and the accident by the side of the road.
  • Chaos, the eternal feminine, is also the crushing force of sexual selection.
  • Most men do not meet female human standards. It is for this reason that women on dating sites rate 85 percent of men as below average in attractiveness.40
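The "85 percent below average" claim is arithmetically possible only because such ratings are skewed: in a right-skewed distribution the mean sits well above the median, so most values fall below the mean. A hedged sketch, with entirely synthetic ratings (the Pareto shape and its α = 1.5 are arbitrary choices, not data from the cited study):

```python
import random
import statistics

# Illustrates how a right-skewed distribution puts most values below
# the mean. The "ratings" here are simulated, not real data.

def share_below_mean(samples):
    mean = statistics.fmean(samples)
    return sum(s < mean for s in samples) / len(samples)

random.seed(1)
# Heavy right tail: most ratings are low, a few are very high.
ratings = [random.paretovariate(1.5) for _ in range(100_000)]
print(f"fraction rated below the mean: {share_below_mean(ratings):.2f}")
```

For a Pareto(α = 1.5) variable the mean is 3 and P(X < 3) = 1 − 3^(−1.5) ≈ 0.81, so roughly four-fifths of simulated ratings land "below average" by construction — no paradox, just skew.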
  • Women’s proclivity to say no, more than any other force, has shaped our evolution into the creative, industrious, upright, large-brained (competitive, aggressive, domineering) creatures that we are.42 It is Nature as Woman who says, “Well, bucko, you’re good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation.”
  • Many things begin to fall into place when you begin to consciously understand the world in this manner. It’s as if the knowledge of your body and soul falls into alignment with the knowledge of your intellect.
  • And there’s more: such knowledge is proscriptive, as well as descriptive. This is the kind of knowing what that helps you know how. This is the kind of is from which you can derive an ought. The Taoist juxtaposition of yin and yang, for example, doesn’t simply portray chaos and order as the fundamental elements of Being—it also tells you how to act.
  • The Way, the Taoist path of life, is represented by (or exists on) the border between the twin serpents. The Way is the path of proper Being. It’s the same Way as that referred to by Christ in John 14:6: I am the way, and the truth and the life. The same idea is expressed in Matthew 7:14: Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.
  • We eternally inhabit order, surrounded by chaos. We eternally occupy known territory, surrounded by the unknown. We experience meaningful engagement when we mediate appropriately between them. We are adapted, in the deepest Darwinian sense, not to the world of objects, but to the meta-realities of order and chaos, yang and yin. Chaos and order make up the eternal, transcendent environment of the living.
  • To straddle that fundamental duality is to be balanced: to have one foot firmly planted in order and security, and the other in chaos, possibility, growth and adventure.
  • Chaos and order are fundamental elements because every lived situation (even every conceivable lived situation) is made up of both.
  • you need to place one foot in what you have mastered and understood and the other in what you are currently exploring and mastering. Then you have positioned yourself where the terror of existence is under control and you are secure, but where you are also alert and engaged. That is where there is something new to master and some way that you can be improved. That is where meaning is to be found.
  • The serpent in Eden therefore means the same thing as the black dot in the yin side of the Taoist yin/yang symbol of totality—that is, the possibility of the unknown and revolutionary suddenly manifesting itself where everything appears calm.
  • The outside, chaos, always sneaks into the inside, because nothing can be completely walled off from the rest of reality. So even the ultimate in safe spaces inevitably harbours a snake.
  • We have seen the enemy, after all, and he is us. The snake inhabits each of our souls.
  • The worst of all possible snakes is the eternal human proclivity for evil. The worst of all possible snakes is psychological, spiritual, personal, internal. No walls, however tall, will keep that out. Even if the fortress were thick enough, in principle, to keep everything bad whatsoever outside, it would immediately appear again within.
  • I have learned that these old stories contain nothing superfluous. Anything accidental—anything that does not serve the plot—has long been forgotten in the telling. As the Russian playwright Anton Chekhov advised, “If there is a rifle hanging on the wall in act one, it must be fired in the next act. Otherwise it has no business being there.”50
  • Eve immediately shares the fruit with Adam. That makes him self-conscious. Little has changed. Women have been making men self-conscious since the beginning of time. They do this primarily by rejecting them—but they also do it by shaming them, if men do not take responsibility. Since women bear the primary burden of reproduction, it’s no wonder. It is very hard to see how it could be otherwise. But the capacity of women to shame men and render them self-conscious is still a primal force of nature.
  • What does it mean to know yourself naked?
  • Naked means vulnerable and easily damaged. Naked means subject to judgment for beauty and health. Naked means unprotected and unarmed in the jungle of nature and man. This is why Adam and Eve became ashamed, immediately after their eyes were opened. They could see—and what they first saw was themselves.
  • In their vulnerability, now fully realized, they felt unworthy to stand before God.
  • Beauty shames the ugly. Strength shames the weak. Death shames the living—and the Ideal shames us all.
  • He tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s merely descriptive.
  • women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
  • then God banishes the first man and the first woman from Paradise, out of infancy, out of the unconscious animal world, into the horrors of history itself. And then He puts cherubim and a flaming sword at the gate of Eden, just to stop them from eating the Fruit of the Tree of Life.
  • Perhaps Heaven is something you must build, and immortality something you must earn.
  • so we return to our original query: Why would someone buy prescription medication for his dog, and then so carefully administer it, when he would not do the same for himself?
  • Why should anyone take care of anything as naked, ugly, ashamed, frightened, worthless, cowardly, resentful, defensive and accusatory as a descendant of Adam? Even if that thing, that being, is himself?
  • We know how we are naked, and how that nakedness can be exploited—and that means we know how others are naked, and how they can be exploited. We can terrify other people, consciously. We can hurt and humiliate them for faults we understand only too well. We can torture them—literally—slowly, artfully and terribly. That’s far more than predation. That’s a qualitative shift in understanding. That’s a cataclysm as large as the development of self-consciousness itself. That’s the entry of the knowledge of Good and Evil into the world.
  • Only man could conceive of the rack, the iron maiden and the thumbscrew. Only man will inflict suffering for the sake of suffering. That is the best definition of evil I have been able to formulate.
  • with this realization we have well-nigh full legitimization of the idea, very unpopular in modern intellectual circles, of Original Sin.
  • Human beings have a great capacity for wrongdoing. It’s an attribute that is unique in the world of life. We can and do make things worse, voluntarily, with full knowledge of what we are doing (as well as accidentally, and carelessly, and in a manner that is willfully blind). Given that terrible capacity, that proclivity for malevolent actions, is it any wonder we have a hard time taking care of ourselves, or others—or even that we doubt the value of the entire human enterprise?
  • The juxtaposition of Genesis 1 with Genesis 2 & 3 (the latter two chapters outlining the fall of man, describing why our lot is so tragedy-ridden and ethically torturous) produces a narrative sequence almost unbearable in its profundity. The moral of Genesis 1 is that Being brought into existence through true speech is Good.
  • The original Man and Woman, existing in unbroken unity with their Creator, did not appear conscious (and certainly not self-conscious). Their eyes were not open. But, in their perfection, they were also less, not more, than their post-Fall counterparts. Their goodness was something bestowed, rather than deserved or earned.
  • Maybe, even in some cosmic sense (assuming that consciousness itself is a phenomenon of cosmic significance), free choice matters.
  • here’s a proposition: perhaps it is not simply the emergence of self-consciousness and the rise of our moral knowledge of Death and the Fall that besets us and makes us doubt our own worth. Perhaps it is instead our unwillingness—reflected in Adam’s shamed hiding—to walk with God, despite our fragility and propensity for evil.
  • The entire Bible is structured so that everything after the Fall—the history of Israel, the prophets, the coming of Christ—is presented as a remedy for that Fall, a way out of evil. The beginning of conscious history, the rise of the state and all its pathologies of pride and rigidity, the emergence of great moral figures who try to set things right, culminating in the Messiah Himself—that is all part of humanity’s attempt, God willing, to set itself right. And what would that mean?
  • And this is an amazing thing: the answer is already implicit in Genesis 1: to embody the Image of God—to speak out of chaos the Being that is Good—but to do so consciously, of our own free choice.
  • Back is the way forward—as T. S. Eliot so rightly insisted
  • We shall not cease from exploration / And the end of all our exploring / Will be to arrive where we started / And know the place for the first time.
  • If we wish to take care of ourselves properly, we would have to respect ourselves—but we don’t, because we are—not least in our own eyes—fallen creatures.
  • If we lived in Truth; if we spoke the Truth—then we could walk with God once again, and respect ourselves, and others, and the world. Then we might treat ourselves like people we cared for.
  • We might strive to set the world straight. We might orient it toward Heaven, where we would want people we cared for to dwell, instead of Hell, where our resentment and hatred would eternally sentence everyone.
  • Then, the primary moral issue confronting society was control of violent, impulsive selfishness and the mindless greed and brutality that accompanies it.
  • It is easy to believe that people are arrogant, and egotistical, and always looking out for themselves. The cynicism that makes that opinion a universal truism is widespread and fashionable.
  • But such an orientation to the world is not at all characteristic of many people. They have the opposite problem: they shoulder intolerable burdens of self-disgust, self-contempt, shame and self-consciousness. Thus, instead of narcissistically inflating their own importance, they don’t value themselves at all, and they don’t take care of themselves with attention and skill.
  • Christ’s archetypal death exists as an example of how to accept finitude, betrayal and tyranny heroically—how to walk with God despite the tragedy of self-conscious knowledge—and not as a directive to victimize ourselves in the service of others.
  • To sacrifice ourselves to God (to the highest good, if you like) does not mean to suffer silently and willingly when some person or organization demands more from us, consistently, than is offered in return. That means we are supporting tyranny, and allowing ourselves to be treated like slaves.
  • I learned two very important lessons from Carl Jung, the famous Swiss depth psychologist, about “doing unto others as you would have them do unto you” or “loving your neighbour as yourself.”
  • The first lesson was that neither of these statements has anything to do with being nice. The second was that both are equations, rather than injunctions.
  • If I am someone’s friend, family member, or lover, then I am morally obliged to bargain as hard on my own behalf as they are on theirs.
  • there is little difference between standing up and speaking for yourself, when you are being bullied or otherwise tormented and enslaved, and standing up and speaking for someone else.
  • you do not simply belong to yourself. You are not simply your own possession to torture and mistreat. This is partly because your Being is inexorably tied up with that of others, and your mistreatment of yourself can have catastrophic consequences for others.
  • metaphorically speaking, there is also this: you have a spark of the divine in you, which belongs not to you, but to God. We are, after all—according to Genesis—made in His image.
  • We can make order from chaos—and vice versa—in our way, with our words. So, we may not exactly be God, but we’re not exactly nothing, either.
  • In my own periods of darkness, in the underworld of the soul, I find myself frequently overcome and amazed by the ability of people to befriend each other, to love their intimate partners and parents and children, and to do what they must do to keep the machinery of the world running.
  • It is this sympathy that should be the proper medicament for self-conscious self-contempt, which has its justification, but is only half the full and proper story. Hatred for self and mankind must be balanced with gratefulness for tradition and the state and astonishment at what normal, everyday people accomplish
  • You have some vital role to play in the unfolding destiny of the world. You are, therefore, morally obliged to take care of yourself.
  • To treat yourself as if you were someone you are responsible for helping is, instead, to consider what would be truly good for you. This is not “what you want.” It is also not “what would make you happy.”
  • You must help a child become a virtuous, responsible, awake being, capable of full reciprocity—able to take care of himself and others, and to thrive while doing so. Why would you think it acceptable to do anything less for yourself?
  • You need to know who you are, so that you understand your armament and bolster yourself in respect to your limitations. You need to know where you are going, so that you can limit the extent of chaos in your life, restructure order, and bring the divine force of Hope to bear on the world.
  • You need to determine how to act toward yourself so that you are most likely to become and to stay a good person.
  • Don’t underestimate the power of vision and direction. These are irresistible forces, able to transform what might appear to be unconquerable obstacles into traversable pathways and expanding opportunities.
  • Once having understood Hell, researched it, so to speak—particularly your own individual Hell—you could decide against going there or creating that.
  • You could, in fact, devote your life to this. That would give you a Meaning, with a capital M. That would justify your miserable existence.
  • That would atone for your sinful nature, and replace your shame and self-consciousness with the natural pride and forthright confidence of someone who has learned once again to walk with God in the Garden.
  • RULE 3   MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU
  • It would be more romantic, I suppose, to suggest that we would have all jumped at the chance for something more productive, bored out of our skulls as we were. But it’s not true. We were all too prematurely cynical and world-weary and leery of responsibility to stick to the debating clubs and Air Cadets and school sports that the adults around us tried to organize. Doing anything wasn’t cool.
  • When you move, everything is up in the air, at least for a while. It’s stressful, but in the chaos there are new possibilities. People, including you, can’t hem you in with their old notions. You get shaken out of your ruts. You can make new, better ruts, with people aiming at better things. I thought this was just a natural development. I thought that every person who moved would have—and want—the same phoenix-like experience.
  • What was it that made Chris and Carl and Ed unable (or, worse, perhaps, unwilling) to move or to change their friendships and improve the circumstances of their lives? Was it inevitable—a consequence of their own limitations, nascent illnesses and traumas of the past?
  • Why did he—like his cousin, like my other friends—continually choose people who, and places that, were not good for him?
  • perhaps, they don’t want the trouble of better. Freud called this a “repetition compulsion.” He thought of it as an unconscious drive to repeat the horrors of the past
  • People create their worlds with the tools they have directly at hand. Faulty tools produce faulty results. Repeated use of the same faulty tools produces the same faulty results.
  • It is in this manner that those who fail to learn from the past doom themselves to repeat it. It’s partly fate. It’s partly inability. It’s partly…unwillingness to learn? Refusal to learn? Motivated refusal to learn?
  • People choose friends who aren’t good for them for other reasons, too. Sometimes it’s because they want to rescue someone.
  • it is not easy to distinguish between someone truly wanting and needing help and someone who is merely exploiting a willing helper. The distinction is difficult even for the person who is wanting and needing and possibly exploiting.
  • When it’s not just naïveté, the attempt to rescue someone is often fuelled by vanity and narcissism.
  • But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you.
  • How do you know that your attempts to pull someone up won’t instead bring them—or you—further down?
  • The same thing happens when well-meaning counsellors place a delinquent teen among comparatively civilized peers. The delinquency spreads, not the stability.65 Down is a lot easier than up.
  • maybe you’re saving someone because you want to convince yourself that the strength of your character is more than just a side effect of your luck and birthplace. Or maybe it’s because it’s easier to look virtuous when standing alongside someone utterly irresponsible.
  • Or maybe you have no plan, genuine or otherwise, to rescue anybody. You’re associating with people who are bad for you not because it’s better for anyone, but because it’s easier.
  • You know it. Your friends know it. You’re all bound by an implicit contract—one aimed at nihilism, and failure, and suffering of the stupidest sort.
  • Before you help someone, you should find out why that person is in trouble. You shouldn’t merely assume that he or she is a noble victim of unjust circumstances and exploitation. It’s the most unlikely explanation, not the most probable.
  • Besides, if you buy the story that everything terrible just happened on its own, with no personal responsibility on the part of the victim, you deny that person all agency in the past (and, by implication, in the present and future, as well).
  • It is far more likely that a given individual has just decided to reject the path upward, because of its difficulty. Perhaps that should even be your default assumption, when faced with such a situation.
  • failure is easy to understand. No explanation for its existence is required. In the same manner, fear, hatred, addiction, promiscuity, betrayal and deception require no explanation. It’s not the existence of vice, or the indulgence in it, that requires explanation. Vice is easy.
  • Failure is easy, too. It’s easier not to shoulder a burden. It’s easier not to think, and not to do, and not to care. It’s easier to put off until tomorrow what needs to be done today,
  • Success: that’s the mystery. Virtue: that’s what’s inexplicable. To fail, you merely have to cultivate a few bad habits. You just have to bide your time. And once someone has spent enough time cultivating bad habits and biding their time, they are much diminished.
  • I am not saying that there is no hope of redemption. But it is much harder to extract someone from a chasm than to lift him from a ditch. And some chasms are very deep. And there’s not much left of the body at the bottom.
  • Carl Rogers, the famous humanistic psychologist, believed it was impossible to start a therapeutic relationship if the person seeking help did not want to improve.67 Rogers believed it was impossible to convince someone to change for the better.
  • none of this is a justification for abandoning those in real need to pursue your narrow, blind ambition, in case it has to be said.
  • Here’s something to consider: If you have a friend whose friendship you wouldn’t recommend to your sister, or your father, or your son, why would you have such a friend for yourself?
  • You are not morally obliged to support someone who is making the world a worse place. Quite the opposite. You should choose people who want things to be better, not worse. It’s a good thing, not a selfish thing, to choose people who are good for you.
  • It is for this reason that every good example is a fateful challenge, and every hero, a judge. Michelangelo’s great perfect marble David cries out to its observer: “You could be more than you are.”
  • Don’t think that it is easier to surround yourself with good healthy people than with bad unhealthy people. It’s not. A good, healthy person is an ideal. It requires strength and daring to stand up near such a person.
  • RULE 4   COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY
  • IT WAS EASIER FOR PEOPLE to be good at something when more of us lived in small, rural communities. Someone could be homecoming queen. Someone else could be spelling-bee champ, math whiz or basketball star. There were only one or two mechanics and a couple of teachers. In each of their domains, these local heroes had the opportunity to enjoy the serotonin-fuelled confidence of the victor.
  • Our hierarchies of accomplishment are now dizzyingly vertical.
  • No matter how good you are at something, or how you rank your accomplishments, there is someone out there who makes you look incompetent.
  • We are not equal in ability or outcome, and never will be. A very small number of people produce very much of everything.
  • People are unhappy at the bottom. They get sick there, and remain unknown and unloved. They waste their lives there. They die there. In consequence, the self-denigrating voice in the minds of people weaves a devastating tale. Life is a zero-sum game. Worthlessness is the default condition.
  • It is for such reasons that a whole generation of social psychologists recommended “positive illusions” as the only reliable route to mental health.69 Their credo? Let a lie be your umbrella. A more dismal, wretched, pessimistic philosophy can hardly be imagined:
  • Here is an alternative approach (and one that requires no illusions). If the cards are always stacked against you, perhaps the game you are playing is somehow rigged (perhaps by you, unbeknownst to yourself). If the internal voice makes you doubt the value of your endeavours—or your life, or life itself—perhaps you should stop listening.
  • There will always be people better than you—that’s a cliché of nihilism, like the phrase, In a million years, who’s going to know the difference? The proper response to that statement is not, Well, then, everything is meaningless. It’s, Any idiot can choose a frame of time within which nothing matters.
  • Standards of better or worse are not illusory or unnecessary. If you hadn’t decided that what you are doing right now was better than the alternatives, you wouldn’t be doing it. The idea of a value-free choice is a contradiction in terms. Value judgments are a precondition for action.
  • Furthermore, every activity, once chosen, comes with its own internal standards of accomplishment. If something can be done at all, it can be done better or worse. To do anything at all is therefore to play a game with a defined and valued end, which can always be reached more or less efficiently and elegantly.
  • We might start by considering the all-too-black-and-white words themselves: “success” or “failure.” You are either a success, a comprehensive, singular, over-all good thing, or its opposite, a failure, a comprehensive, singular, irredeemably bad thing.
  • There are vital degrees and gradations of value obliterated by this binary system, and the consequences are not good.
  • there is not just one game at which to succeed or fail. There are many games and, more specifically, many good games—
  • if changing games does not work, you can invent a new one.
  • and athletic pursuits. You might consider judging your success across all the games you play.
  • When we are very young we are neither individual nor informed. We have not had the time nor gained the wisdom to develop our own standards. In consequence, we must compare ourselves to others, because standards are necessary.
  • As we mature we become, by contrast, increasingly individual and unique. The conditions of our lives become more and more personal and less and less comparable with those of others. Symbolically speaking, this means we must leave the house ruled by our father, and confront the chaos of our individual Being.
  • We must then rediscover the values of our culture—veiled from us by our ignorance, hidden in the dusty treasure-trove of the past—rescue them, and integrate them into our own lives. This is what gives existence its full and necessary meaning.
  • What is it that you actually love? What is it that you genuinely want? Before you can articulate your own standards of value, you must see yourself as a stranger—and then you must get to know yourself.
  • Dare to be truthful. Dare to articulate yourself, and express (or at least become aware of) what would really justify your life.
  • Consult your resentment. It’s a revelatory emotion, for all its pathology. It’s part of an evil triad: arrogance, deceit, and resentment. Nothing causes more harm than this underworld Trinity. But resentment always means one of two things. Either the resentful person is immature, in which case he or she should shut up, quit whining, and get on with it, or there is tyranny afoot—in which case the person subjugated has a moral obligation to speak up.
  • Be cautious when you’re comparing yourself to others. You’re a singular being, once you’re an adult. You have your own particular, specific problems—financial, intimate, psychological, and otherwise.
  • Those are embedded in the unique broader context of your existence. Your career or job works for you in a personal manner, or it does not, and it does so in a unique interplay with the other specifics of your life.
  • We must see, but to see, we must aim, so we are always aiming. Our minds are built on the hunting-and-gathering platforms of our bodies. To hunt is to specify a target, track it, and throw at it.
  • We live within a framework that defines the present as eternally lacking and the future as eternally better. If we did not see things this way, we would not act at all. We wouldn’t even be able to see, because to see we must focus, and to focus we must pick one thing above all else on which to focus.
  • The disadvantage to all this foresight and creativity is chronic unease and discomfort. Because we always contrast what is with what could be, we have to aim at what could be.
  • The present is eternally flawed. But where you start might not be as important as the direction you are heading. Perhaps happiness is always to be found in the journey uphill, and not in the fleeting sense of satisfaction awaiting at the next peak.
  • Called upon properly, the internal critic will suggest something to set in order, which you could set in order, which you would set in order—voluntarily, without resentment, even with pleasure.
  • “Excuse me,” you might say to yourself, without irony or sarcasm. “I’m trying to reduce some of the unnecessary suffering around here. I could use some help.” Keep the derision at bay. “I’m wondering if there is anything that you would be willing to do? I’d be very grateful for your service.” Ask honestly and with humility. That’s no simple matter.
Javier E

Why America's Institutions Are Failing - The Atlantic - 0 views

  • “The government agencies we thought were keeping us safe and secure—the CDC, the FDA, the Police—have either failed or, worse, have been revealed to be active creators of danger and insecurity,” Alex Tabarrok, an economics professor at George Mason University, wrote on Twitter.
  • Why have America’s instruments of hard and soft power failed so spectacularly in 2020?
  • We are prepared for wars against states and militant groups, but not against stateless forces such as pandemics and climate change.
  • ...32 more annotations...
  • our risk sensor is fixed to the anxieties and illusions of the 1900s
  • We’re arming and empowering the police like it’s 1990, when urban crime had reached historic highs. But violent-crime rates have fallen by more than 50 percent in almost every major American city in the past generation, while police still drape themselves in military gear and kill more than 1,000 people annually.
  • Too many police are instructed to believe that the 20th-century crime wave never ended.
  • Between the 1960s and the early 1990s, the violent-crime rate in many U.S. cities rose “to levels seen only in the most violent, war-torn nations of the developing world,”
  • As American cities became perceived as war zones, police responded by adopting a “warrior” mentality.
  • Then violent crime plunged by more than 70 percent from 1993 to 2018, according to data maintained by the Department of Justice. Although officers routinely face threats that most white-collar workers never will, cops are safer now than at any point in nearly 50 years.
  • calls the idealization of the warrior “the most problematic aspect of modern [police] policy.”
  • The message is clear: Be a warrior, because it’s a war out there.
  • The warrior mentality encourages an adversarial approach in which officers needlessly escalate encounters.
  • The U.S. has about the same number of police officers per capita as, say, Australia; but adjusted for population, U.S. law enforcement kills 10 times more people.
  • the CDC had waited “its entire existence for this moment,” but it was so unprepared to deal with COVID-19 that the group initially in charge of the response, the Division of Viral Diseases, had to cede responsibilities to the Influenza Division, despite the fact that COVID-19 is not caused by any kind of influenza virus
  • Police aren’t just trained to feel like warriors; many are armed for war
  • Over time, SWAT itself served as a gateway drug for police militarization, as equipment once reserved for special teams, such as AR-15 rifles, was made available to ordinary officers.
  • the War on Drugs has been roundly discredited as a trillion-dollar failure that incurs thousands of unnecessary deaths. But it has bequeathed us a world where police bearing semiautomatics are armed with the wrong tools for the actual job
  • Violent crime plays a minuscule role in the day of a modern officer, who spends most of his or her time driving around, taking ho-hum radio calls, and performing the tertiary duties of traffic patroller and mental-health counselor.
  • the U.S. mental-health crisis has been effectively outsourced to the streets, where police who aren’t trained as social workers or behavioral therapists must perform the ad hoc duties of both.
  • Rather than respond to the drastically changing nature of American life, our cities and counties use police as a civic Swiss Army knife to solve problems such as homelessness and mental-health emergencies that have little to do with police training.
  • the failures of American police are not unique, but rather a symptom of a broader breakdown in high-quality governance.
  • Before it stood for the Centers for Disease Control and Prevention, the CDC was founded as the Communicable Disease Center in the 1940s. Its original mission was to stop an epidemic. The organization’s first 400, Atlanta-based employees were tasked with arresting an outbreak of malaria in the Southeast
  • Today, the center’s 14,000 employees work “at the speed of science”—that is, slowly and deliberately—to understand an array of health issues, including cancer, obesity, and vaping.
  • its mission creep has transformed what was once a narrowly focused agency into a kaleidoscopic bureaucracy with no fast-twitch instinct for achieving its founding mission to protect Americans from an epidemic.
  • The CDC’s recent failures are well known, but worth repeating. It failed to keep track of early COVID-19 cases in part because of a leaden-footed reliance on fax machines and other outdated record-keeping technology. It failed to compile accurate case counts, forcing private actors—such as The Atlantic’s COVID Tracking Project—to fill the void. It failed its most basic coordination functions as an agency
  • “The world has changed dramatically since the most violent years of the 1990s, but police training trails lived experience,”
  • Most important, the CDC failed to manufacture basic testing equipment. Its initial test kits were contaminated and unusable, which allowed the disease to spread undetected throughout the U.S. for weeks.
  • Compare the situation in the U.S. with the one in East Asia, where several countries have navigated the pandemic far more deftly. China, South Korea, Taiwan, Singapore, and Vietnam all updated their infectious-disease protocols based on what they learned from 21st-century epidemics: SARS in 2003, H1N1 in 2009, and MERS in 2012. These countries quickly understood what artillery would be necessary to take on COVID-19, including masks, tests, tracing, and quarantine spaces. Yet the CDC—armed with an $8 billion budget and a global team of scientists and officials—was somehow unprepared to read from the playbook.
  • The FDA fumbled just as tragically. In January, Alex Greninger, a virologist at the University of Washington, was prepared to build an in-house coronavirus test
  • By the time Greninger was ready to set up his lab, the calendar had turned to March. Hundreds of thousands of Americans were sick, and the outbreak was uncontrollable.
  • the White House cannot be entirely blamed for the ponderous incompetence of what ought to be the greatest public-health system in the world.
  • Not every American institution is trapped in amber. For a perhaps surprising example of one that has adapted to 21st-century needs, take the Federal Reserve.
  • Ben Bernanke, the Fed chair during the Great Recession, used his expertise about the 1930s economy to avoid a similar collapse in financial markets in 2008. Today’s Fed chair, Jerome Powell, has gone even further, urging Congress and the Treasury to “think big” and add to our already-historic deficits.
  • the Federal Reserve has junked old shibboleths about inflation and deficit spending and embraced a policy that might have scandalized mainstream economists in the 1990s. Rejecting the status-quo bias that plagues so many institutions, this 106-year-old is still changing with the world.
  • what strikes me is that America’s safekeeping institutions have forgotten how to properly see the threats of the 21st century and move quickly to respond to them. Those who deny history may be doomed to repeat it. But those who deny the present are just doomed
Javier E

Why Britain Failed Its Coronavirus Test - The Atlantic - 0 views

  • Britain has not been alone in its failure to prevent mass casualties—almost every country on the Continent suffered appalling losses—but one cannot avoid the grim reality spelled out in the numbers: If almost all countries failed, then Britain failed more than most.
  • The raw figures are grim. Britain has the worst overall COVID-19 death toll in Europe, with more than 46,000 dead according to official figures, while also suffering the Continent’s second-worst “excess death” tally per capita, more than double that in France and eight times higher than Germany’s
  • The British government as a whole made poorer decisions, based on poorer advice, founded on poorer evidence, supplied by poorer testing, with the inevitable consequence that it achieved poorer results than almost any of its peers. It failed in its preparation, its diagnosis, and its treatment.
  • ...38 more annotations...
  • In the past two decades, the list of British calamities, policy misjudgments, and forecasting failures has been eye-watering: the disaster of Iraq, the botched Libyan intervention in 2011, the near miss of Scottish independence in 2014, the woeful handling of Britain’s divorce from the European Union from 2016 onward
  • What emerges is a picture of a country whose systemic weaknesses were exposed with appalling brutality, a country that believed it was stronger than it was, and that paid the price for failures that have built up for years
  • The most difficult question about all this is also the simplest: Why?
  • The human immune system actually has two parts. There is, as Cummings correctly identifies, the adaptive part. But there is also an innate part, preprogrammed as the first line of defense against infectious disease. Humans need both. The same is true of a state and its government, said those I spoke with—many of whom were sympathetic to Cummings’s diagnosis. Without a functioning structure, the responsive antibodies of the government and its agencies cannot learn on the job. When the pandemic hit, both parts of Britain’s immune system were found wanting.
  • Britain’s pandemic story is not all bad. The NHS is almost universally seen as having risen to the challenge; the University of Oxford is leading the race to develop the first coronavirus vaccine for international distribution, backed with timely and significant government cash; new hospitals were built and treatments discovered with extraordinary speed; the welfare system did not collapse, despite the enormous pressure it suddenly faced; and a national economic safety net was rolled out quickly.
  • One influential U.K. government official told me that although individual mistakes always happen in a fast-moving crisis, and had clearly taken place in Britain’s response to COVID-19, it was impossible to escape the conclusion that Britain was simply not ready. As Ian Boyd, a professor and member of SAGE, put it: “The reality is, there has been a major systemic failure.”
  • “It’s obvious that the British state was not prepared for” the pandemic, this official told me. “But, even worse, many parts of the state thought they were prepared, which is significantly more dangerous.”
  • When the crisis came, too much of Britain’s core infrastructure simply failed, according to senior officials and experts involved in the pandemic response
  • Like much of the Western world, Britain had prepared for an influenza pandemic, whereas places that were hit early—Hong Kong, South Korea, Singapore, Taiwan—had readied themselves for the type of respiratory illness that COVID-19 proved to be.
  • The consequences may be serious and long term, but the most immediately tragic effect was that creating space in hospitals appears to have been prioritized over shielding Britain’s elderly, many of whom were moved to care homes, part of what Britain calls the social-care sector, where the disease then spread. Some 25,000 patients were discharged into these care homes between March 17 and April 16, many without a requirement that they secure a negative coronavirus test beforehand.
  • There was a bit too much exceptionalism about how brilliant British science was at the start of this outbreak, which ended up with a blind spot about what was happening in Korea, Taiwan, Singapore, where we just weren’t looking closely enough, and they turned out to be the best in the world at tackling the coronavirus,” a former British cabinet minister told me.
  • The focus on influenza pandemics and the lack of a tracing system were compounded by a shortfall in testing capacity.
  • Johnson’s strategy throughout was one that his hero Winston Churchill raged against during the First World War, when he concluded that generals had been given too much power by politicians. In the Second World War, Churchill, by then prime minister and defense secretary, argued that “at the summit, true politics and strategy are one.” Johnson did not take this approach, succumbing—as his detractors would have it—to fatalistic management rather than bold leadership, empowering the generals rather than taking responsibility himself
  • “It was a mixture of poor advice and fatalism on behalf of the experts,” one former colleague of Johnson’s told me, “and complacency and boosterism on behalf of the PM.”
  • What it all adds up to, then, is a sobering reality: Institutional weaknesses of state capacity and advice were not corrected by political judgment, and political weaknesses were not corrected by institutional strength. The system was hardwired for a crisis that did not come, and could not adapt quickly enough to the one that did.
  • Britain’s NHS has come to represent the country itself, its sense of identity and what it stands for. Set up in 1948, it became known as the first universal health-care system of any major country in the world (although in reality New Zealand got there first). Its creation, three years after victory in the Second World War, was a high-water mark in the country’s power and prestige—a time when it was a global leader, an exception.
  • Every developed country in the world, apart from the United States, has a universal health-care system, many of which produce better results than the NHS.
  • Yet from its beginnings, the NHS has occupied a unique hold on British life. It is routinely among the most trusted institutions in the country. Its key tenet—that all Britons will have access to health care, free at the point of service—symbolizes an aspirational egalitarianism that, even as inequality has risen since the Margaret Thatcher era, remains at the core of British identity.
  • In asking the country to rally to the NHS’s defense, Johnson was triggering its sense of self, its sense of pride and national unity—its sense of exceptionalism.
  • Before the coronavirus, the NHS was already under considerable financial pressure. Waiting times for appointments were rising, and the country had one of the lowest levels of spare intensive-care capacity in Europe. In 2017, Simon Stevens, the NHS’s chief executive, compared the situation to the time of the health service’s founding decades prior: an “economy in disarray, the end of empire, a nation negotiating its place in the world.”
  • When the pandemic hit, then, Britain was not the strong, successful, resilient country it imagined, but a poorly governed and fragile one. The truth is, Britain was sick before it caught the coronavirus.
  • In effect, Britain was rigorously building capacity to help the NHS cope, but releasing potentially infected elderly, and vulnerable, patients in the process. By late June, more than 19,000 people had died in care homes from COVID-19. Separate excess-death data suggest that the figure may be considerably higher
  • Britain failed to foresee the dangers of such an extraordinary rush to create hospital capacity, a shift that was necessary only because of years of underfunding and decades of missed opportunities to bridge the divide between the NHS and retirement homes, which other countries, such as Germany, had found the political will to do.
  • Ultimately, the scandal is a consequence of a political culture that has proved unable to confront and address long-term problems, even when they are well known.
  • other health systems, such as Germany’s, which is better funded and decentralized, performed better than Britain’s. Those I spoke with who either are in Germany or know about Germany’s success told me there was an element of luck about the disparity with Britain. Germany had a greater industrial base to produce medical testing and personal protective equipment, and those who returned to Germany with the virus from abroad were often younger and healthier, meaning the initial strain on its health system was less.
  • However, this overlooks core structural issues—resulting from political choices in each country—that meant that Germany proved more resilient when the crisis came, whether because of the funding formula for its health system, which allows individuals more latitude to top up their coverage with private contributions, or its decentralized nature, which meant that separate regions and hospitals were better able to respond to local outbreaks and build their own testing network.
  • Also unlike Britain, which has ducked the problem of reforming elderly care, Germany created a system in 1995 that everyone pays into, avoids catastrophic costs, and has cross-party support.
  • A second, related revelation of the crisis—which also exposed the failure of the British state—is that underneath the apparent simplicity of the NHS’s single national model lies an engine of bewildering complexity, whose lines of responsibility, control, and accountability are unintelligible to voters and even to most politicians.
  • Britain, I was told, has found a way to be simultaneously overcentralized and weak at its center. The pandemic revealed the British state’s inability to manage the nation’s health:
  • Since at least the 1970s, growing inequality between comparatively rich southeast England (including London) and the rest of the country has spurred all parties to pledge to “rebalance the economy” and make it less reliant on the capital. Yet large parts remain poorer than the European average. According to official EU figures, Britain has five regions with a per capita gross domestic product of less than $25,000. France, Germany, Ireland, Austria, the Netherlands, Denmark, and Sweden have none
  • If Britain were part of the United States, it would be anywhere from the third- to the eighth-poorest state, depending on the measure.
  • Britain’s performance in this crisis has been so bad, it is damaging the country’s reputation, both at home and abroad.
  • Inside Downing Street, officials believe that the lessons of the pandemic apply far beyond the immediate confines of elderly care and coronavirus testing, taking in Britain’s long-term economic failures and general governance, as well as what they regard as its ineffective foreign policy and diplomacy.
  • the scale of the task itself is enormous. “We need a complete revamp of our government structure because it’s not fit for purpose anymore,” Boyd told me. “I just don’t know if we really understand our weakness.”
  • In practice, does Johnson have the confidence to match his diagnosis of Britain’s ills, given the timidity of his approach during the pandemic? The nagging worry among even Johnson’s supporters in Parliament is that although he may campaign as a Ronald Reagan, he might govern as a Silvio Berlusconi, failing to solve the structural problems he has identified.
  • This is not a story of pessimistic fatalism, of inevitable decline. Britain was able to partially reverse a previous slump in the 1980s, and Germany, seen as a European laggard in the ‘90s, is now the West’s obvious success story. One of the strengths of the Westminster parliamentary system is that it occasionally produces governments—like Johnson’s—with real power to effect change, should they try to enact it.
  • It has been overtaken by many of its rivals, whether in terms of health provision or economic resilience, but does not seem to realize it. And once the pandemic passes, the problems Britain faces will remain: how to sustain institutions so that they bind the country together, not pull it apart; how to remain prosperous in the 21st century’s globalized economy; how to promote its interests and values; how to pay for the ever-increasing costs of an aging population.
  • “The really important question,” Boyd said, “is whether the state, in its current form, is structurally capable of delivering on the big-picture items that are coming, whether pandemics or climate change or anything else.”
Javier E

John Adams' Fear Has Come to Pass - by David French - 0 views

  • When I try to explain the aspirational genius of the American founding, I always refer to two documents
  • They’re by the famous “frenemies” of the American founding, Thomas Jefferson and John Adams.
  • Jefferson’s Declaration of Independence. The second is Adams’s very short Letter to the Massachusetts Militia, dated October 11, 1798.
  • ...37 more annotations...
  • these documents define the American social compact—the mutual responsibilities of citizen and state—that define the American experiment.
  • Here’s the first pair, from the Declaration:
  • We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.--That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.
  • The first sentence recognizes the inherent dignity of man as human beings created in the image of God. The second sentence, nearly as important, recognizes the unavoidable duty of government to recognize and protect that dignity. While the sole purpose of government isn’t to protect liberty, a government that fails to protect liberty fails in an essential function. 
  • Adams wrote to the officers of the First Brigade of the Third Division of the Militia of Massachusetts to outline the responsibilities of the citizens of the new republic.
  • The letter contains the famous declaration that “our Constitution was made only for a moral and religious People. It is wholly inadequate to the government of any other.” But I’m more interested in the two preceding sentences:
  • Because We have no Government armed with Power capable of contending with human Passions unbridled by morality and Religion. Avarice, Ambition, Revenge or Galantry, would break the strongest Cords of our Constitution as a Whale goes through a Net.
  • Put in plain English, this means that when public virtue fails, our constitutional government does not possess the power to preserve itself.
  • the American experiment depends upon both the government upholding its obligation to preserve liberty and the American people upholding theirs to exercise that liberty towards virtuous purposes. 
  • Citizen and state both have obligations, and if either side fails, it imperils the republic.
  • We see this reality play out in American history.
  • The seeds for the first great American crisis were sown in the original Constitution itself. By failing to end slavery and by failing to extend the Bill of Rights to protect citizens from the oppression of state and local governments, the early American government flatly failed to live up to the principles of the Declaration, and we paid the price in blood.
  • our nation seethes again today
  • The response to John Adams’s warning is not to arm the government with more power but to equip citizens with more virtue.
  • Its politics are gripped by deep hatred and abiding animosity, and its culture groans under the weight of human despair. Hatred rules our politics; anxiety, depression, and loneliness dominate our culture.
  • Those many cultural critics who look at the United States of America and declare that “something is wrong” are exactly right
  • here’s the difference—unlike the days when we could point to a specific source of government oppression, such as slavery or Jim Crow, the American government (though highly imperfect) currently protects individual liberty and associational freedoms to a degree we’ve never seen in American history.
  • Even after the Civil War, the quick end of Union occupation of the Confederacy enabled the creation of an apartheid substate in the South. Once again, the government failed to live up to the core principles of the founding. It is by God’s grace that the Jim Crow regime ended primarily as the result of the Civil Rights Movement—one of the great Christian justice movements in history—and not as the result of another convulsive civil conflict.
  • But what can the government do about friendlessness? About anxiety? What can the government do to make sure that we are not—in Robert Putnam’s memorable phrase—“bowling alone”?
  • that challenge is compounded by the fact that the most engaged American citizens are its most angry partisans.
  • And if you think that most-partisan cohort is seething with anger because they suffer from painful oppression, think again. The data is clear. As the More in Common project notes, the most polarized Americans are disproportionately white and college-educated on the left and disproportionately white and retired on the right. 
  • The people disproportionately driving polarization in the United States are not oppressed minorities, but rather some of the most powerful, most privileged, wealthiest people who’ve ever lived.
  • They enjoy more freedom and opportunity than virtually any prior generation of humans, all while living under the protective umbrella of the most powerful military in the history of the planet.
  • It’s simply an astonishing level of discontent in the midst of astonishing wealth and power.
  • maybe it’s not so astonishing, because accumulating wealth and power is not and never was the path to meaning and purpose.
  • much of both the right and left postliberal impulse is related to the first of John Adams’s two key sentences. If we don’t have a government “armed with Power capable of contending with human Passions unbridled by morality and Religion,” then their solution is to increase the power of government. Arm it with more power. 
  • But when it comes to government, you’re never arming an “it,” you’re arming a “them”—a collection of human beings who suffer from all the same character defects and cultural maladies as the rest of us
  • As James Madison observed in Federalist 51 (the second-best Federalist Paper), “If angels were to govern men, neither external nor internal controls on government would be necessary.” Yet American postliberalism asks us to empower men and women who frequently don’t even pretend to be virtuous, who often glory in their vice, all for the “common good.” 
  • We still battle the legacy of past injustice and the present reality of lingering discrimination, but there’s just no comparison between the legal systems that destabilized America and the legal systems that exist today. 
  • how do we do that? The path past animosity and against despair can be as short and simple as the path from Twitter to the kitchen table
  • It’s shifting the focus from the infuriating thing you can’t control to the people you can love, to the institutions you can build.
  • in this present time, thanks to the steadily expanding sphere of American liberty, we have more ability to unite—including for religious purposes—than at any time in American history. Yet we still bowl alone. We tweet alone. We rage alone, staring at screens and forming online tribes that provide an empty simulacrum of real relationships.
  • for all too many of us that feels empty, like our small actions are simply inadequate to address the giant concerns that dominate our minds
  • To do the big thing—to heal our land—we have to do the small things.
  • We need a frame shift. Do not think of doing the small things as abandoning the larger quest. See every family, every friendship, every healthy church, every functioning school board as indispensable to our continued American experiment. 
  • For those who think and obsess about politics, this shift from big to small is hard. It’s hard to think that how you love your friends might be more important to our nation than what you think of CRT
  • When our crisis is one of hatred, anxiety, and despair, don’t look to politics to heal our hearts. Our government can’t contend with “human Passions unbridled by morality and Religion.” Our social fabric is fraying. The social compact is crumbling. Our government is imperfect, but if this republic fractures, its people will be to blame. 
Javier E

On gun violence, we are a failed state - The Washington Post - 0 views

  • The surest sign a political regime is failing is its inability to do anything about a problem universally seen as urgent that has some obvious remedies. And it’s a mark of political corruption when unaccountable cliques block solutions that enjoy broad support and force their selfish interests to prevail over the common good.
  • On gun violence, the United States has become a corrupt failed state.
  • Whatever happens, we can’t ban assault weapons, we can’t strengthen background checks, we can’t do anything.
  • ...6 more annotations...
  • Memo to the media: Stop saying in somber, serious tones that we must do more about mental health. This might well be true, but in the context of crimes such as those at Stoneman Douglas High, offering such sentiments is to be complicit in propaganda by pretending that a cover story is actually on the level
  • In corrupt failed states, politics is about lying and misdirection. On guns, our debate is a pack of lies and evasions. In no other country is the phrase “thoughts and prayers” a sacrilege, a cover for cowardice.
  • In no other country are the words “mental health” so empty. They are muttered by politicians who have no history of caring in the least about programs to help those with psychological or psychiatric difficulties. But they need to say something to rationalize their allegiance to a gun lobby that appears to be utterly indifferent to mass murder.
  • We should not have to point out over and over that while mental illness exists everywhere, other countries do not have killing sprees comparable to ours.
  • At the heart of our political system’s failure to address the epidemic of violence is the Republican Party’s decision to become a paid agent of the gun manufacturers’ lobby. The party of law and order cares about neither if doing so means causing the least disturbance to the National Rifle Association.
  • Aggravating our difficulty in regulating weapons is the vast overrepresentation of rural states in the U.S. Senate, which makes some Democrats wary of taking on the NRA. This is another classic problem of failed regimes: Their structures are no longer capable of responding to current needs.
Javier E

Collapsing Levels of Trust Are Devastating America - The Atlantic - 0 views

  • American history is driven by periodic moments of moral convulsion
  • Harvard political scientist Samuel P. Huntington noticed that these convulsions seem to hit the United States every 60 years or so: the Revolutionary period of the 1760s and ’70s; the Jacksonian uprising of the 1820s and ’30s; the Progressive Era, which began in the 1890s; and the social-protest movements of the 1960s and early ’70s
  • A highly moralistic generation appears on the scene. It uses new modes of communication to seize control of the national conversation. Groups formerly outside of power rise up and take over the system. These are moments of agitation and excitement, frenzy and accusation, mobilization and passion.
  • ...168 more annotations...
  • In 1981, Huntington predicted that the next moral convulsion would hit America around the second or third decade of the 21st century—that is, right about now.
  • Trump is the final instrument of this crisis, but the conditions that brought him to power and make him so dangerous at this moment were decades in the making, and those conditions will not disappear if he is defeated.
  • Social trust is a measure of the moral quality of a society—of whether the people and institutions in it are trustworthy, whether they keep their promises and work for the common good
  • When people in a society lose faith or trust in their institutions and in each other, the nation collapses.
  • This is an account of how, over the past few decades, America became a more untrustworthy society
  • under the stresses of 2020, American institutions and the American social order crumbled and were revealed as more untrustworthy still
  • We had a chance, in crisis, to pull together as a nation and build trust. We did not. That has left us a broken, alienated society caught in a distrust doom loop.
  • The Baby Boomers grew up in the 1950s and ’60s, an era of family stability, widespread prosperity, and cultural cohesion. The mindset they embraced in the late ’60s and have embodied ever since was all about rebelling against authority, unshackling from institutions, and celebrating freedom, individualism, and liberation.
  • The emerging generations today enjoy none of that sense of security. They grew up in a world in which institutions failed, financial systems collapsed, and families were fragile. Children can now expect to have a lower quality of life than their parents, the pandemic rages, climate change looms, and social media is vicious. Their worldview is predicated on threat, not safety.
  • Thus the values of the Millennial and Gen Z generations that will dominate in the years ahead are the opposite of Boomer values: not liberation, but security; not freedom, but equality; not individualism, but the safety of the collective; not sink-or-swim meritocracy, but promotion on the basis of social justice
  • A new culture is dawning. The Age of Precarity is here.
  • I’ve spent my career rebutting the idea that America is in decline, but the events of these past six years, and especially of 2020, have made clear that we live in a broken nation. The cancer of distrust has spread to every vital organ.
  • Those were the days of triumphant globalization. Communism was falling. Apartheid was ending. The Arab-Israeli dispute was calming down. Europe was unifying. China was prospering. In the United States, a moderate Republican president, George H. W. Bush, gave way to the first Baby Boomer president, a moderate Democrat, Bill Clinton.
  • The stench of national decline is in the air. A political, social, and moral order is dissolving. America will only remain whole if we can build a new order in its place.
  • The American economy grew nicely. The racial wealth gap narrowed. All the great systems of society seemed to be working: capitalism, democracy, pluralism, diversity, globalization. It seemed, as Francis Fukuyama wrote in his famous “The End of History?” essay for The National Interest, “an unabashed victory for economic and political liberalism.”
  • Nations with low social trust—like Brazil, Morocco, and Zimbabwe—have struggling economies.
  • We think of the 1960s as the classic Boomer decade, but the false summer of the 1990s was the high-water mark of that ethos
  • The first great theme of that era was convergence. Walls were coming down. Everybody was coming together.
  • The second theme was the triumph of classical liberalism. Liberalism was not just a philosophy—it was a spirit and a zeitgeist, a faith that individual freedom would blossom in a loosely networked democratic capitalist world. Enterprise and creativity would be unleashed. America was the great embodiment and champion of this liberation.
  • The third theme was individualism. Society flourished when individuals were liberated from the shackles of society and the state, when they had the freedom to be true to themselves.
  • For his 2001 book, Moral Freedom, the political scientist Alan Wolfe interviewed a wide array of Americans. The moral culture he described was no longer based on mainline Protestantism, as it had been for generations
  • Instead, Americans, from urban bobos to suburban evangelicals, were living in a state of what he called moral freedom: the belief that life is best when each individual finds his or her own morality—inevitable in a society that insists on individual freedom.
  • moral freedom, like the other dominant values of the time, contained within it a core assumption: If everybody does their own thing, then everything will work out for everybody.
  • This was an ideology of maximum freedom and minimum sacrifice.
  • It all looks naive now. We were naive about what the globalized economy would do to the working class, naive to think the internet would bring us together, naive to think the global mixing of people would breed harmony, naive to think the privileged wouldn’t pull up the ladders of opportunity behind them
  • Over the 20 years after I sat with Kosieva, it all began to unravel. The global financial crisis had hit, the Middle East was being ripped apart by fanatics. On May 15, 2011, street revolts broke out in Spain, led by the self-declared Indignados—“the outraged.” “They don’t represent us!” they railed as an insult to the Spanish establishment. It would turn out to be the cry of a decade.
  • Millennials and members of Gen Z have grown up in the age of that disappointment, knowing nothing else. In the U.S. and elsewhere, this has produced a crisis of faith, across society but especially among the young. It has produced a crisis of trust.
  • Social trust is a generalized faith in the people of your community. It consists of smaller faiths. It begins with the assumption that we are interdependent, our destinies linked. It continues with the assumption that we share the same moral values. We share a sense of what is the right thing to do in different situations
  • High-trust societies have what Fukuyama calls spontaneous sociability. People are able to organize more quickly, initiate action, and sacrifice for the common good.
  • When you look at research on social trust, you find all sorts of virtuous feedback loops. Trust produces good outcomes, which then produce more trust. In high-trust societies, corruption is lower and entrepreneurship is catalyzed.
  • Higher-trust nations have lower economic inequality, because people feel connected to each other and are willing to support a more generous welfare state.
  • People in high-trust societies are more civically engaged. Nations that score high in social trust—like the Netherlands, Sweden, China, and Australia—have rapidly growing or developed economies.
  • Renewal is hard to imagine. Destruction is everywhere, and construction difficult to see.
  • As the ethicist Sissela Bok once put it, “Whatever matters to human beings, trust is the atmosphere in which it thrives.”
  • During most of the 20th century, through depression and wars, Americans expressed high faith in their institutions
  • In 1964, for example, 77 percent of Americans said they trusted the federal government to do the right thing most or all of the time.
  • By 1994, only one in five Americans said they trusted government to do the right thing.
  • Then came the Iraq War and the financial crisis and the election of Donald Trump. Institutional trust levels remained pathetically low. What changed was the rise of a large group of people who were actively and poisonously alienated—who were not only distrustful but explosively distrustful. Explosive distrust is not just an absence of trust or a sense of detached alienation—it is an aggressive animosity and an urge to destroy. Explosive distrust is the belief that those who disagree with you are not just wrong but illegitimate
  • In 1997, 64 percent of Americans had a great or good deal of trust in the political competence of their fellow citizens; today only a third of Americans feel that way.
  • In most societies, interpersonal trust is stable over the decades. But for some—like Denmark, where about 75 percent say the people around them are trustworthy, and the Netherlands, where two-thirds say so—the numbers have actually risen.
  • In America, interpersonal trust is in catastrophic decline. In 2014, according to the General Social Survey conducted by NORC at the University of Chicago, only 30.3 percent of Americans agreed that “most people can be trusted.”
  • Today, a majority of Americans say they don’t trust other people when they first meet them.
  • There’s evidence to suggest that marital infidelity, academic cheating, and animal cruelty are all on the rise in America, but it’s hard to directly measure the overall moral condition of society—how honest people are, and how faithful.
  • Trust is the ratio between the number of people who betray you and the number of people who remain faithful to you. It’s not clear that there is more betrayal in America than there used to be—but there are certainly fewer faithful supports around people than there used to be.
  • Hundreds of books and studies on declining social capital and collapsing family structure demonstrate this. In the age of disappointment, people are less likely to be surrounded by faithful networks of people they can trust.
  • Black Americans have high trust in other Black Americans; it’s the wider society they don’t trust, for good and obvious reasons
  • As Vallier puts it, trust levels are a reflection of the moral condition of a nation at any given time.
  • high national trust is a collective moral achievement.
  • High national distrust is a sign that people have earned the right to be suspicious. Trust isn’t a virtue—it’s a measure of other people’s virtue.
  • Unsurprisingly, the groups with the lowest social trust in America are among the most marginalized.
  • Black Americans have been one of the most ill-treated groups in American history; their distrust is earned distrust
  • In 2018, 37.3 percent of white Americans felt that most people can be trusted, according to the General Social Survey, but only 15.3 percent of Black Americans felt the same.
  • People become trusting when the world around them is trustworthy. When they are surrounded by people who live up to their commitments. When they experience their country as a fair place.
  • In 2002, 43 percent of Black Americans were very or somewhat satisfied with the way Black people are treated in the U.S. By 2018, only 18 percent felt that way, according to Gallup.
  • The second disenfranchised low-trust group includes the lower-middle class and the working poor.
  • this group makes up about 40 percent of the country.
  • “They are driven by the insecurity of their place in society and in the economy,” he says. They are distrustful of technology and are much more likely to buy into conspiracy theories. “They’re often convinced by stories that someone is trying to trick them, that the world is against them,”
  • the third marginalized group that scores extremely high on social distrust: young adults. These are people who grew up in the age of disappointment. It’s the only world they know.
  • In 2012, 40 percent of Baby Boomers believed that most people can be trusted, as did 31 percent of members of Generation X. In contrast, only 19 percent of Millennials said most people can be trusted
  • Seventy-three percent of adults under 30 believe that “most of the time, people just look out for themselves,” according to a Pew survey from 2018. Seventy-one percent of those young adults say that most people “would try to take advantage of you if they got a chance.”
  • A mere 10 percent of Gen Zers trust politicians to do the right thing.
  • Only 35 percent of young people, versus 67 percent of old people, believe that Americans respect the rights of people who are not like them.
  • Fewer than a third of Millennials say America is the greatest country in the world, compared to 64 percent of members of the Silent Generation.
  • “values and behavior are shaped by the degree to which survival is secure.” In the age of disappointment, our sense of safety went away
  • Some of this is physical insecurity: school shootings, terrorist attacks, police brutality, and overprotective parenting at home
  • the true insecurity is financial, social, and emotional.
  • First, financial insecurity
  • By the time the Baby Boomers hit a median age of 35, their generation owned 21 percent of the nation’s wealth
  • As of last year, Millennials—who will hit an average age of 35 in three years—owned just 3.2 percent of the nation’s wealth.
  • Next, emotional insecurity:
  • fewer children growing up in married two-parent households, more single-parent households, more depression, and higher suicide rates.
  • Then, identity insecurity.
  • All the traits that were once assigned to you by your community, you must now determine on your own: your identity, your morality, your gender, your vocation, your purpose, and the place of your belonging. Self-creation becomes a major anxiety-inducing act of young adulthood.
  • liquid modernity
  • Finally, social insecurity.
  • In the age of social media our “sociometers”—the antennae we use to measure how other people are seeing us—are up and on high alert all the time. Am I liked? Am I affirmed?
  • Danger is ever present. “For many people, it is impossible to think without simultaneously thinking about what other people would think about what you’re thinking,” the educator Fredrik deBoer has written. “This is exhausting and deeply unsatisfying. As long as your self-conception is tied up in your perception of other people’s conception of you, you will never be free to occupy a personality with confidence; you’re always at the mercy of the next person’s dim opinion of you and your whole deal.”
  • In this world, nothing seems safe; everything feels like chaos.
  • Distrust sows distrust. It produces the spiritual state that Emile Durkheim called anomie, a feeling of being disconnected from society, a feeling that the whole game is illegitimate, that you are invisible and not valued, a feeling that the only person you can really trust is yourself.
  • People plagued by distrust can start to see threats that aren’t there; they become risk averse
  • Americans take fewer risks and are much less entrepreneurial than they used to be. In 2014, the rate of business start-ups hit a nearly 40-year low. Since the early 1970s, the rate at which people move across state lines each year has dropped by 56 percent
  • People lose faith in experts. They lose faith in truth, in the flow of information that is the basis of modern society. “A world of truth is a world of trust, and vice versa,”
  • In periods of distrust, you get surges of populism; populism is the ideology of those who feel betrayed
  • People are drawn to leaders who use the language of menace and threat, who tell group-versus-group power narratives. You also get a lot more political extremism. People seek closed, rigid ideological systems that give them a sense of security.
  • fanaticism is a response to existential anxiety. When people feel naked and alone, they revert to tribe. Their radius of trust shrinks, and they only trust their own kind.
  • When many Americans see Trump’s distrust, they see a man who looks at the world as they do.
  • By February 2020, America was a land mired in distrust. Then the plague arrived.
  • From the start, the pandemic has hit the American mind with sledgehammer force. Anxiety and depression have spiked. In April, Gallup recorded a record drop in self-reported well-being, as the share of Americans who said they were thriving fell to the same low point as during the Great Recession
  • These kinds of drops tend to produce social upheavals. A similar drop was seen in Tunisian well-being just before the street protests that led to the Arab Spring.
  • The emotional crisis seems to have hit low-trust groups the hardest
  • “low trusters” were more nervous during the early months of the pandemic, more likely to have trouble sleeping, more likely to feel depressed, less likely to say the public authorities were responding well to the pandemic
  • Eighty-one percent of Americans under 30 reported feeling anxious, depressed, lonely, or hopeless at least one day in the previous week, compared to 48 percent of adults 60 and over.
  • Americans looked to their governing institutions to keep them safe. And nearly every one of their institutions betrayed them
  • The president downplayed the crisis, and his administration was a daily disaster area
  • The Centers for Disease Control and Prevention produced faulty tests, failed to provide up-to-date data on infections and deaths, and didn’t provide a trustworthy voice for a scared public.
  • The Food and Drug Administration wouldn’t allow private labs to produce their own tests without a lengthy approval process.
  • In nations that ranked high on the World Values Survey measure of interpersonal trust—like China, Australia, and most of the Nordic states—leaders were able to mobilize quickly, come up with a plan, and count on citizens to comply with the new rules.
  • In low-trust nations—like Mexico, Spain, and Brazil—there was less planning, less compliance, less collective action, and more death.
  • Countries that fell somewhere in the middle—including the U.S., Germany, and Japan—had a mixed record depending on the quality of their leadership.
  • South Korea, where more than 65 percent of people say they trust government when it comes to health care, was able to build a successful test-and-trace regime. In America, where only 31 percent of Republicans and 44 percent of Democrats say the government should be able to use cellphone data to track compliance with experts’ coronavirus social-contact guidelines, such a system was never really implemented.
  • For decades, researchers have been warning about institutional decay. Institutions get caught up in one of those negative feedback loops that are so common in a world of mistrust. They become ineffective and lose legitimacy. People who lose faith in them tend not to fund them. Talented people don’t go to work for them. They become more ineffective still.
  • On the right, this anti-institutional bias has manifested itself as hatred of government; an unwillingness to defer to expertise, authority, and basic science; and a reluctance to fund the civic infrastructure of society, such as a decent public health system
  • On the left, distrust of institutional authority has manifested as a series of checks on power that have given many small actors the power to stop common plans, producing what Fukuyama calls a vetocracy
  • In 2020, American institutions groaned and sputtered. Academics wrote up plan after plan and lobbed them onto the internet. Few of them went anywhere. America had lost the ability to build new civic structures to respond to ongoing crises like climate change, opioid addiction, and pandemics, or to reform existing ones.
  • In a lower-trust era like today, Levin told me, “there is a greater instinct to say, ‘They’re failing us.’ We see ourselves as outsiders to the systems—an outsider mentality that’s hard to get out of.”
  • Americans haven’t just lost faith in institutions; they’ve come to loathe them, even to think that they are evil
  • 55 percent of Americans believe that the coronavirus that causes COVID-19 was created in a lab and 59 percent believe that the U.S. government is concealing the true number of deaths
  • Half of all Fox News viewers believe that Bill Gates is plotting a mass-vaccination campaign so he can track people.
  • This spring, nearly a third of Americans were convinced that it was probably or definitely true that a vaccine existed but was being withheld by the government.
  • institutions like the law, the government, the police, and even the family don’t merely serve social functions, Levin said; they form the individuals who work and live within them. The institutions provide rules to live by, standards of excellence to live up to, social roles to fulfill.
  • By 2020, people had stopped seeing institutions as places they entered to be morally formed,
  • Instead, they see institutions as stages on which they can perform, can display their splendid selves.
  • People run for Congress not so they can legislate, but so they can get on TV. People work in companies so they can build their personal brand.
  • The result is a world in which institutions not only fail to serve their social function and keep us safe, they also fail to form trustworthy people. The rot in our structures spreads to a rot in ourselves.
  • The Failure of Society
  • The coronavirus has confronted America with a social dilemma. A social dilemma, the University of Pennsylvania scholar Cristina Bicchieri notes, is “a situation in which each group member gets a higher outcome if she pursues her individual self-interest, but everyone in the group is better off if all group members further the common interest.”
  • Social distancing is a social dilemma. Many low-risk individuals have been asked to endure some large pain (unemployment, bankruptcy) and some small inconvenience (mask wearing) for the sake of the common good. If they could make and keep this moral commitment to each other in the short term, the curve would be crushed, and in the long run we’d all be better off. It is the ultimate test of American trustworthiness.
  • While pretending to be rigorous, people relaxed and started going out. It was like watching somebody gradually give up on a diet. There wasn’t a big moment of capitulation, just an extra chocolate bar here, a bagel there, a scoop of ice cream before bed
  • in reality this was a mass moral failure of Republicans and Democrats and independents alike. This was a failure of social solidarity, a failure to look out for each other.
  • Alexis de Tocqueville discussed a concept called the social body. Americans were clearly individualistic, he observed, but they shared common ideas and common values, and could, when needed, produce common action. They could form a social body.
  • Over time, those common values eroded, and were replaced by a value system that put personal freedom above every other value
  • When Americans were confronted with the extremely hard task of locking down for months without any of the collective resources that would have made it easier—habits of deference to group needs; a dense network of community bonds to help hold each other accountable; a history of trust that if you do the right thing, others will too; preexisting patterns of cooperation; a sense of shame if you deviate from the group—they couldn’t do it. America failed.
  • The Crack-up
  • This wasn’t just a political and social crisis, it was also an emotional trauma.
  • The week before George Floyd was killed, the National Center for Health Statistics released data showing that a third of all Americans were showing signs of clinical anxiety or depression. By early June, after Floyd’s death, the percentage of Black Americans showing clinical signs of depression and anxiety disorders had jumped from 36 to 41 percent
  • By late June, American national pride was lower than at any time since Gallup started measuring, in 2001
  • In another poll, 71 percent of Americans said they were angry about the state of the country, and just 17 percent said they were proud.
  • By late June, it was clear that America was enduring a full-bore crisis of legitimacy, an epidemic of alienation, and a loss of faith in the existing order.
  • The most alienated, anarchic actors in society—antifa, the Proud Boys, QAnon—seemed to be driving events. The distrust doom loop was now at hand.
  • The Age of Precarity
  • Cultures are collective responses to common problems. But when reality changes, culture takes a few years, and a moral convulsion, to completely shake off the old norms and values.
  • The culture that is emerging, and which will dominate American life over the next decades, is a response to a prevailing sense of threat.
  • This new culture values security over liberation, equality over freedom, the collective over the individual.
  • From risk to security.
  • we’ve entered an age of precarity in which every political or social movement has an opportunity pole and a risk pole. In the opportunity mentality, risk is embraced because of the upside possibilities. In the risk mindset, security is embraced because people need protection from downside dangers
  • In this period of convulsion, almost every party and movement has moved from its opportunity pole to its risk pole.
  • From achievement to equality
  • In the new culture we are entering, that meritocratic system looks more and more like a ruthless sorting system that excludes the vast majority of people, rendering their life precarious and second class, while pushing the “winners” into a relentless go-go lifestyle that leaves them exhausted and unhappy
  • Equality becomes the great social and political goal. Any disparity—racial, economic, meritocratic—comes to seem hateful.
  • From self to society
  • If we’ve lived through an age of the isolated self, people in the emerging culture see embedded selves. Socialists see individuals embedded in their class group. Right-wing populists see individuals as embedded pieces of a national identity group. Left-wing critical theorists see individuals embedded in their racial, ethnic, gender, or sexual-orientation identity group.
  • The cultural mantra shifts from “Don’t label me!” to “My label is who I am.”
  • From global to local
  • When there is massive distrust of central institutions, people shift power to local institutions, where trust is higher. Power flows away from Washington to cities and states.
  • From liberalism to activism
  • enlightenment liberalism, which was a long effort to reduce the role of passions in politics and increase the role of reason. Politics was seen as a competition between partial truths.
  • Liberalism is ill-suited for an age of precarity. It demands that we live with a lot of ambiguity, which is hard when the atmosphere already feels unsafe. Furthermore, it is thin. It offers an open-ended process of discovery when what people hunger for is justice and moral certainty.
  • liberalism’s niceties come to seem like a cover that oppressors use to mask and maintain their systems of oppression. Public life isn’t an exchange of ideas; it’s a conflict of groups engaged in a vicious death struggle
  • The cultural shifts we are witnessing offer more safety to the individual at the cost of clannishness within society. People are embedded more in communities and groups, but in an age of distrust, groups look at each other warily, angrily, viciously.
  • The shift toward a more communal viewpoint is potentially a wonderful thing, but it leads to cold civil war unless there is a renaissance of trust. There’s no avoiding the core problem. Unless we can find a way to rebuild trust, the nation does not function.
  • How to Rebuild Trust
  • Historians have more to offer, because they can cite examples of nations that have gone from pervasive social decay to relative social health. The two most germane to our situation are Great Britain between 1830 and 1848 and the United States between 1895 and 1914.
  • In both periods, a highly individualistic and amoral culture was replaced by a more communal and moralistic one.
  • But there was a crucial difference between those eras and our own, at least so far. In both cases, moral convulsion led to frenetic action.
  • As Robert Putnam and Shaylyn Romney Garrett note in their forthcoming book, The Upswing, the American civic revival that began in the 1870s produced a stunning array of new organizations: the United Way, the NAACP, the Boy Scouts, the Forest Service, the Federal Reserve System, 4-H clubs, the Sierra Club, the settlement-house movement, the compulsory-education movement, the American Bar Association, the American Legion, the ACLU, and on and on
  • After the civic revivals, both nations witnessed frenetic political reform. During the 1830s, Britain passed the Reform Act, which widened the franchise; the Factory Act, which regulated workplaces; and the Municipal Corporations Act, which reformed local government.
  • The Progressive Era in America saw an avalanche of reform: civil-service reform; food and drug regulation; the Sherman Act, which battled the trusts; the secret ballot; and so on. Civic life became profoundly moralistic, but political life became profoundly pragmatic and anti-ideological. Pragmatism and social-science expertise were valued.
  • Can America in the 2020s turn itself around the way the America of the 1890s, or the Britain of the 1830s, did? Can we create a civic renaissance and a legislative revolution?
  • I see no scenario in which we return to being the nation we were in 1965, with a cohesive national ethos, a clear national establishment, trusted central institutions, and a pop-culture landscape in which people overwhelmingly watched the same shows and talked about the same things.
  • The age of distrust has smashed the converging America and the converging globe—that great dream of the 1990s—and has left us with the reality that our only plausible future is decentralized pluralism.
  • The key to making decentralized pluralism work still comes down to one question: Do we have the energy to build new organizations that address our problems, the way the Brits did in the 1830s and Americans did in the 1890s?
  • social trust is built within organizations in which people are bound together to do joint work, in which they struggle together long enough for trust to gradually develop, in which they develop shared understandings of what is expected of each other, in which they are enmeshed in rules and standards of behavior that keep them trustworthy when their commitments might otherwise falter.
  • Over the past 60 years, we have given up on the Rotary Club and the American Legion and other civic organizations and replaced them with Twitter and Instagram. Ultimately, our ability to rebuild trust depends on our ability to join and stick to organizations.
  • Whether we emerge from this transition stronger depends on our ability, from the bottom up and the top down, to build organizations targeted at our many problems. If history is any guide, this will be the work not of months, but of one or two decades.
  • For centuries, America was the greatest success story on earth, a nation of steady progress, dazzling achievement, and growing international power. That story threatens to end on our watch, crushed by the collapse of our institutions and the implosion of social trust
  • But trust can be rebuilt through the accumulation of small heroic acts—by the outrageous gesture of extending vulnerability in a world that is mean, by proffering faith in other people when that faith may not be returned. Sometimes trust blooms when somebody holds you against all logic, when you expected to be dropped.
  • By David Brooks
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • ...204 more annotations...
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage: the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies”?
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed “there is a free exchange of views in this company” and “people are not afraid to say what they really think”.
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times’s failure to take Trump’s chances seriously as an example of how “prematurely shutting down inquiry and debate” can allow “conventional wisdom to ossify in a way that blinds society”.
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be.
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values also better serves not just truth but justice, too
  • over the long term journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance,
  • will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • (I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and opinion columnist. The paper's spokeswoman, Danielle Rhoades Ha, did not respond to the question.)
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom's journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, this reporter shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to "refine" it and "add context", and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

President Obama's Interview With Jeffrey Goldberg on Syria and Foreign Policy - The Atl... - 0 views

  • The president believes that Churchillian rhetoric and, more to the point, Churchillian habits of thought, helped bring his predecessor, George W. Bush, to ruinous war in Iraq.
  • Obama entered the White House bent on getting out of Iraq and Afghanistan; he was not seeking new dragons to slay. And he was particularly mindful of promising victory in conflicts he believed to be unwinnable. “If you were to say, for instance, that we’re going to rid Afghanistan of the Taliban and build a prosperous democracy instead, the president is aware that someone, seven years later, is going to hold you to that promise,” Ben Rhodes, Obama’s deputy national-security adviser, and his foreign-policy amanuensis, told me not long ago.
  • Power is a partisan of the doctrine known as “responsibility to protect,” which holds that sovereignty should not be considered inviolate when a country is slaughtering its own citizens. She lobbied him to endorse this doctrine in the speech he delivered when he accepted the Nobel Peace Prize in 2009, but he declined. Obama generally does not believe a president should place American soldiers at great risk in order to prevent humanitarian disasters, unless those disasters pose a direct security threat to the United States.
  • ...162 more annotations...
  • Obama’s resistance to direct intervention only grew. After several months of deliberation, he authorized the CIA to train and fund Syrian rebels, but he also shared the outlook of his former defense secretary, Robert Gates, who had routinely asked in meetings, “Shouldn’t we finish up the two wars we have before we look for another?”
  • In his first term, he came to believe that only a handful of threats in the Middle East conceivably warranted direct U.S. military intervention. These included the threat posed by al‑Qaeda; threats to the continued existence of Israel (“It would be a moral failing for me as president of the United States” not to defend Israel, he once told me); and, not unrelated to Israel’s security, the threat posed by a nuclear-armed Iran.
  • Bush and Scowcroft removed Saddam Hussein’s army from Kuwait in 1991, and they deftly managed the disintegration of the Soviet Union; Scowcroft also, on Bush’s behalf, toasted the leaders of China shortly after the slaughter in Tiananmen Square.
  • As Obama was writing his campaign manifesto, The Audacity of Hope, in 2006, Susan Rice, then an informal adviser, felt it necessary to remind him to include at least one line of praise for the foreign policy of President Bill Clinton, to partially balance the praise he showered on Bush and Scowcroft.
  • “When you have a professional army,” he once told me, “that is well armed and sponsored by two large states”—Iran and Russia—“who have huge stakes in this, and they are fighting against a farmer, a carpenter, an engineer who started out as protesters and suddenly now see themselves in the midst of a civil conflict …” He paused. “The notion that we could have—in a clean way that didn’t commit U.S. military forces—changed the equation on the ground there was never true.”
  • The message Obama telegraphed in speeches and interviews was clear: He would not end up like the second President Bush—a president who became tragically overextended in the Middle East, whose decisions filled the wards of Walter Reed with grievously wounded soldiers, who was helpless to stop the obliteration of his reputation, even when he recalibrated his policies in his second term. Obama would say privately that the first task of an American president in the post-Bush international arena was “Don’t do stupid shit.”
  • Hillary Clinton, when she was Obama’s secretary of state, argued for an early and assertive response to Assad’s violence. In 2014, after she left office, Clinton told me that “the failure to help build up a credible fighting force of the people who were the originators of the protests against Assad … left a big vacuum, which the jihadists have now filled.” When The Atlantic published this statement, and also published Clinton’s assessment that “great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle,” Obama became “rip-shit angry,” according to one of his senior advisers. The president did not understand how “Don’t do stupid shit” could be considered a controversial slogan.
  • The Iraq invasion, Obama believed, should have taught Democratic interventionists like Clinton, who had voted for its authorization, the dangers of doing stupid shit. (Clinton quickly apologized to Obama for her comments.)
  • Obama, unlike liberal interventionists, is an admirer of the foreign-policy realism of President George H. W. Bush and, in particular, of Bush’s national-security adviser, Brent Scowcroft (“I love that guy,” Obama once told me).
  • The danger to the United States posed by the Assad regime did not rise to the level of these challenges.
  • Obama generally believes that the Washington foreign-policy establishment, which he secretly disdains, makes a fetish of “credibility”—particularly the sort of credibility purchased with force. The preservation of credibility, he says, led to Vietnam. Within the White House, Obama would argue that “dropping bombs on someone to prove that you’re willing to drop bombs on someone is just about the worst reason to use force.”
  • American national-security credibility, as it is conventionally understood in the Pentagon, the State Department, and the cluster of think tanks headquartered within walking distance of the White House, is an intangible yet potent force—one that, when properly nurtured, keeps America’s friends feeling secure and keeps the international order stable.
  • All week, White House officials had publicly built the case that Assad had committed a crime against humanity. Kerry’s speech would mark the culmination of this campaign.
  • But the president had grown queasy. In the days after the gassing of Ghouta, Obama would later tell me, he found himself recoiling from the idea of an attack unsanctioned by international law or by Congress. The American people seemed unenthusiastic about a Syria intervention; so too did one of the few foreign leaders Obama respects, Angela Merkel, the German chancellor. She told him that her country would not participate in a Syria campaign. And in a stunning development, on Thursday, August 29, the British Parliament denied David Cameron its blessing for an attack. John Kerry later told me that when he heard that, “internally, I went, Oops.”
  • Obama was also unsettled by a surprise visit early in the week from James Clapper, his director of national intelligence, who interrupted the President’s Daily Brief, the threat report Obama receives each morning from Clapper’s analysts, to make clear that the intelligence on Syria’s use of sarin gas, while robust, was not a “slam dunk.” He chose the term carefully. Clapper, the chief of an intelligence community traumatized by its failures in the run-up to the Iraq War, was not going to overpromise, in the manner of the onetime CIA director George Tenet, who famously guaranteed George W. Bush a “slam dunk” in Iraq.
  • While the Pentagon and the White House’s national-security apparatuses were still moving toward war (John Kerry told me he was expecting a strike the day after his speech), the president had come to believe that he was walking into a trap—one laid both by allies and by adversaries, and by conventional expectations of what an American president is supposed to do.
  • Late on Friday afternoon, Obama determined that he was simply not prepared to authorize a strike. He asked McDonough, his chief of staff, to take a walk with him on the South Lawn of the White House. Obama did not choose McDonough randomly: He is the Obama aide most averse to U.S. military intervention, and someone who, in the words of one of his colleagues, “thinks in terms of traps.” Obama, ordinarily a preternaturally confident man, was looking for validation, and trying to devise ways to explain his change of heart, both to his own aides and to the public
  • The third, and most important, factor, he told me, was “our assessment that while we could inflict some damage on Assad, we could not, through a missile strike, eliminate the chemical weapons themselves, and what I would then face was the prospect of Assad having survived the strike and claiming he had successfully defied the United States, that the United States had acted unlawfully in the absence of a UN mandate, and that that would have potentially strengthened his hand rather than weakened it.
  • Others had difficulty fathoming how the president could reverse himself the day before a planned strike. Obama, however, was completely calm. “If you’ve been around him, you know when he’s ambivalent about something, when it’s a 51–49 decision,” Ben Rhodes told me. “But he was completely at ease.”
  • Obama also shared with McDonough a long-standing resentment: He was tired of watching Washington unthinkingly drift toward war in Muslim countries. Four years earlier, the president believed, the Pentagon had “jammed” him on a troop surge for Afghanistan. Now, on Syria, he was beginning to feel jammed again.
  • The fourth factor, he said, was of deeper philosophical importance. “This falls in the category of something that I had been brooding on for some time,” he said. “I had come into office with the strong belief that the scope of executive power in national-security issues is very broad, but not limitless.”
  • Obama’s decision caused tremors across Washington as well. John McCain and Lindsey Graham, the two leading Republican hawks in the Senate, had met with Obama in the White House earlier in the week and had been promised an attack. They were angered by the about-face. Damage was done even inside the administration. Neither Chuck Hagel, then the secretary of defense, nor John Kerry was in the Oval Office when the president informed his team of his thinking. Kerry would not learn about the change until later that evening. “I just got fucked over,” he told a friend shortly after talking to the president that night. (When I asked Kerry recently about that tumultuous night, he said, “I didn’t stop to analyze it. I figured the president had a reason to make a decision and, honestly, I understood his notion.”)
  • The president asked Congress to authorize the use of force—the irrepressible Kerry served as chief lobbyist—and it quickly became apparent in the White House that Congress had little interest in a strike. When I spoke with Biden recently about the red-line decision, he made special note of this fact. “It matters to have Congress with you, in terms of your ability to sustain what you set out to do,” he said. Obama “didn’t go to Congress to get himself off the hook. He had his doubts at that point, but he knew that if he was going to do anything, he better damn well have the public with him, or it would be a very short ride.” Congress’s clear ambivalence convinced Biden that Obama was correct to fear the slippery slope. “What happens when we get a plane shot down? Do we not go in and rescue?,” Biden asked. “You need the support of the American people.”
  • At the G20 summit in St. Petersburg, which was held the week after the Syria reversal, Obama pulled Putin aside, he recalled to me, and told the Russian president “that if he forced Assad to get rid of the chemical weapons, that that would eliminate the need for us taking a military strike.” Within weeks, Kerry, working with his Russian counterpart, Sergey Lavrov, would engineer the removal of most of Syria’s chemical-weapons arsenal—a program whose existence Assad until then had refused to even acknowledge.
  • The arrangement won the president praise from, of all people, Benjamin Netanyahu, the Israeli prime minister, with whom he has had a consistently contentious relationship. The removal of Syria’s chemical-weapons stockpiles represented “the one ray of light in a very dark region,” Netanyahu told me not long after the deal was announced.
  • John Kerry today expresses no patience for those who argue, as he himself once did, that Obama should have bombed Assad-regime sites in order to buttress America’s deterrent capability. “You’d still have the weapons there, and you’d probably be fighting ISIL” for control of the weapons, he said, referring to the Islamic State, the terror group also known as ISIS. “It just doesn’t make sense. But I can’t deny to you that this notion about the red line being crossed and [Obama’s] not doing anything gained a life of its own.”
  • Today that decision is a source of deep satisfaction for him.
  • “I’m very proud of this moment,” he told me. “The overwhelming weight of conventional wisdom and the machinery of our national-security apparatus had gone fairly far. The perception was that my credibility was at stake, that America’s credibility was at stake. And so for me to press the pause button at that moment, I knew, would cost me politically. And the fact that I was able to pull back from the immediate pressures and think through in my own mind what was in America’s interest, not only with respect to Syria but also with respect to our democracy, was as tough a decision as I’ve made—and I believe ultimately it was the right decision to make.”
  • By 2013, Obama’s resentments were well developed. He resented military leaders who believed they could fix any problem if the commander in chief would simply give them what they wanted, and he resented the foreign-policy think-tank complex. A widely held sentiment inside the White House is that many of the most prominent foreign-policy think tanks in Washington are doing the bidding of their Arab and pro-Israel funders. I’ve heard one administration official refer to Massachusetts Avenue, the home of many of these think tanks, as “Arab-occupied territory.”
  • over the past few months, I’ve spent several hours talking with him about the broadest themes of his “long game” foreign policy, including the themes he is most eager to discuss—namely, the ones that have nothing to do with the Middle East.
  • I have come to believe that, in Obama’s mind, August 30, 2013, was his liberation day, the day he defied not only the foreign-policy establishment and its cruise-missile playbook, but also the demands of America’s frustrating, high-maintenance allies in the Middle East—countries, he complains privately to friends and advisers, that seek to exploit American “muscle” for their own narrow and sectarian ends.
  • “Where am I controversial? When it comes to the use of military power,” he said. “That is the source of the controversy. There’s a playbook in Washington that presidents are supposed to follow. It’s a playbook that comes out of the foreign-policy establishment. And the playbook prescribes responses to different events, and these responses tend to be militarized responses. Where America is directly threatened, the playbook works. But the playbook can also be a trap that can lead to bad decisions. In the midst of an international challenge like Syria, you get judged harshly if you don’t follow the playbook, even if there are good reasons why it does not apply.”
  • For some foreign-policy experts, even within his own administration, Obama’s about-face on enforcing the red line was a dispiriting moment in which he displayed irresolution and naïveté, and did lasting damage to America’s standing in the world. “Once the commander in chief draws that red line,” Leon Panetta, who served as CIA director and then as secretary of defense in Obama’s first term, told me recently, “then I think the credibility of the commander in chief and this nation is at stake if he doesn’t enforce it.” Right after Obama’s reversal, Hillary Clinton said privately, “If you say you’re going to strike, you have to strike. There’s no choice.”
  • Obama’s defenders, however, argue that he did no damage to U.S. credibility, citing Assad’s subsequent agreement to have his chemical weapons removed. “The threat of force was credible enough for them to give up their chemical weapons,” Tim Kaine, a Democratic senator from Virginia, told me. “We threatened military action and they responded. That’s deterrent credibility.”
  • History may record August 30, 2013, as the day Obama prevented the U.S. from entering yet another disastrous Muslim civil war, and the day he removed the threat of a chemical attack on Israel, Turkey, or Jordan. Or it could be remembered as the day he let the Middle East slip from America’s grasp, into the hands of Russia, Iran, and ISIS
  • I spoke with Obama about foreign policy when he was a U.S. senator, in 2006. At the time, I was familiar mainly with the text of a speech he had delivered four years earlier, at a Chicago antiwar rally. It was an unusual speech for an antiwar rally in that it was not antiwar; Obama, who was then an Illinois state senator, argued only against one specific and, at the time, still theoretical, war. “I suffer no illusions about Saddam Hussein,” he said. “He is a brutal man. A ruthless man … But I also know that Saddam poses no imminent and direct threat to the United States or to his neighbors.” He added, “I know that an invasion of Iraq without a clear rationale and without strong international support will only fan the flames of the Middle East, and encourage the worst, rather than best, impulses of the Arab world, and strengthen the recruitment arm of al-Qaeda.”
  • This speech had made me curious about its author. I wanted to learn how an Illinois state senator, a part-time law professor who spent his days traveling between Chicago and Springfield, had come to a more prescient understanding of the coming quagmire than the most experienced foreign-policy thinkers of his party, including such figures as Hillary Clinton, Joe Biden, and John Kerry, not to mention, of course, most Republicans and many foreign-policy analysts and writers, including me.
  • This was the moment the president believes he finally broke with what he calls, derisively, the “Washington playbook.”
  • “ISIS is not an existential threat to the United States,” he told me in one of these conversations. “Climate change is a potential existential threat to the entire world if we don’t do something about it.” Obama explained that climate change worries him in particular because “it is a political problem perfectly designed to repel government intervention. It involves every single country, and it is a comparatively slow-moving emergency, so there is always something seemingly more urgent on the agenda.”
  • At the moment, of course, the most urgent of the “seemingly more urgent” issues is Syria. But at any given moment, Obama’s entire presidency could be upended by North Korean aggression, or an assault by Russia on a member of NATO, or an ISIS-planned attack on U.S. soil. Few presidents have faced such diverse tests on the international stage as Obama has, and the challenge for him, as for all presidents, has been to distinguish the merely urgent from the truly important, and to focus on the important.
  • My goal in our recent conversations was to see the world through Obama’s eyes, and to understand what he believes America’s role in the world should be. This article is informed by our recent series of conversations, which took place in the Oval Office; over lunch in his dining room; aboard Air Force One; and in Kuala Lumpur during his most recent visit to Asia, in November. It is also informed by my previous interviews with him and by his speeches and prolific public ruminations, as well as by conversations with his top foreign-policy and national-security advisers, foreign leaders and their ambassadors in Washington, friends of the president and others who have spoken with him about his policies and decisions, and his adversaries and critics.
  • Over the course of our conversations, I came to see Obama as a president who has grown steadily more fatalistic about the constraints on America’s ability to direct global events, even as he has, late in his presidency, accumulated a set of potentially historic foreign-policy achievements—controversial, provisional achievements, to be sure, but achievements nonetheless: the opening to Cuba, the Paris climate-change accord, the Trans-Pacific Partnership trade agreement, and, of course, the Iran nuclear deal.
  • These he accomplished despite his growing sense that larger forces—the riptide of tribal feeling in a world that should have already shed its atavism; the resilience of small men who rule large countries in ways contrary to their own best interests; the persistence of fear as a governing human emotion—frequently conspire against the best of America’s intentions. But he also has come to learn, he told me, that very little is accomplished in international affairs without U.S. leadership.
  • Obama talked me through this apparent contradiction. “I want a president who has the sense that you can’t fix everything,” he said. But on the other hand, “if we don’t set the agenda, it doesn’t happen.” He explained what he meant. “The fact is, there is not a summit I’ve attended since I’ve been president where we are not setting the agenda, where we are not responsible for the key results,” he said. “That’s true whether you’re talking about nuclear security, whether you’re talking about saving the world financial system, whether you’re talking about climate.”
  • One day, over lunch in the Oval Office dining room, I asked the president how he thought his foreign policy might be understood by historians. He started by describing for me a four-box grid representing the main schools of American foreign-policy thought. One box he called isolationism, which he dismissed out of hand. “The world is ever-shrinking,” he said. “Withdrawal is untenable.” The other boxes he labeled realism, liberal interventionism, and internationalism. “I suppose you could call me a realist in believing we can’t, at any given moment, relieve all the world’s misery,” he said. “We have to choose where we can make a real impact.” He also noted that he was quite obviously an internationalist, devoted as he is to strengthening multilateral organizations and international norms.
  • If a crisis, or a humanitarian catastrophe, does not meet his stringent standard for what constitutes a direct national-security threat, Obama said, he doesn’t believe that he should be forced into silence. He is not so much the realist, he suggested, that he won’t pass judgment on other leaders.
  • Though he has so far ruled out the use of direct American power to depose Assad, he was not wrong, he argued, to call on Assad to go. “Oftentimes when you get critics of our Syria policy, one of the things that they’ll point out is ‘You called for Assad to go, but you didn’t force him to go. You did not invade.’ And the notion is that if you weren’t going to overthrow the regime, you shouldn’t have said anything. That’s a weird argument to me, the notion that if we use our moral authority to say ‘This is a brutal regime, and this is not how a leader should treat his people,’ once you do that, you are obliged to invade the country and install a government you prefer.”
  • “I am very much the internationalist,” Obama said in a later conversation. “And I am also an idealist insofar as I believe that we should be promoting values, like democracy and human rights and norms and values
  • “Having said that,” he continued, “I also believe that the world is a tough, complicated, messy, mean place, and full of hardship and tragedy. And in order to advance both our security interests and those ideals and values that we care about, we’ve got to be hardheaded at the same time as we’re bighearted, and pick and choose our spots, and recognize that there are going to be times where the best that we can do is to shine a spotlight on something that’s terrible, but not believe that we can automatically solve it. There are going to be times where our security interests conflict with our concerns about human rights. There are going to be times where we can do something about innocent people being killed, but there are going to be times where we can’t.”
  • If Obama ever questioned whether America really is the world’s one indispensable nation, he no longer does so. But he is the rare president who seems at times to resent indispensability, rather than embrace it.
  • “Free riders aggravate me,” he told me. Recently, Obama warned that Great Britain would no longer be able to claim a “special relationship” with the United States if it did not commit to spending at least 2 percent of its GDP on defense. “You have to pay your fair share,” Obama told David Cameron, who subsequently met the 2 percent threshold.
  • Part of his mission as president, Obama explained, is to spur other countries to take action for themselves, rather than wait for the U.S. to lead. The defense of the liberal international order against jihadist terror, Russian adventurism, and Chinese bullying depends in part, he believes, on the willingness of other nations to share the burden with the U.S
  • This is why the controversy surrounding the assertion—made by an anonymous administration official to The New Yorker during the Libya crisis of 2011—that his policy consisted of “leading from behind” perturbed him. “We don’t have to always be the ones who are up front,” he told me. “Sometimes we’re going to get what we want precisely because we are sharing in the agenda.
  • The president also seems to believe that sharing leadership with other countries is a way to check America’s more unruly impulses. “One of the reasons I am so focused on taking action multilaterally where our direct interests are not at stake is that multilateralism regulates hubris,”
  • He consistently invokes what he understands to be America’s past failures overseas as a means of checking American self-righteousness. “We have history,” he said. “We have history in Iran, we have history in Indonesia and Central America. So we have to be mindful of our history when we start talking about intervening, and understand the source of other people’s suspicions.”
  • In his efforts to off-load some of America’s foreign-policy responsibilities to its allies, Obama appears to be a classic retrenchment president in the manner of Dwight D. Eisenhower and Richard Nixon. Retrenchment, in this context, is defined as “pulling back, spending less, cutting risk, and shifting burdens to allies.”
  • One difference between Eisenhower and Nixon, on the one hand, and Obama, on the other, Sestanovich said, is that Obama “appears to have had a personal, ideological commitment to the idea that foreign policy had consumed too much of the nation’s attention and resources.”
  • But once he decides that a particular challenge represents a direct national-security threat, he has shown a willingness to act unilaterally. This is one of the larger ironies of the Obama presidency: He has relentlessly questioned the efficacy of force, but he has also become the most successful terrorist-hunter in the history of the presidency, one who will hand to his successor a set of tools an accomplished assassin would envy
  • “He applies different standards to direct threats to the U.S.,” Ben Rhodes says. “For instance, despite his misgivings about Syria, he has not had a second thought about drones.” Some critics argue he should have had a few second thoughts about what they see as the overuse of drones. But John Brennan, Obama’s CIA director, told me recently that he and the president “have similar views. One of them is that sometimes you have to take a life to save even more lives. We have a similar view of just-war theory. The president requires near-certainty of no collateral damage. But if he believes it is necessary to act, he doesn’t hesitate.”
  • Those who speak with Obama about jihadist thought say that he possesses a no-illusions understanding of the forces that drive apocalyptic violence among radical Muslims, but he has been careful about articulating that publicly, out of concern that he will exacerbate anti-Muslim xenophobia
  • He has a tragic realist’s understanding of sin, cowardice, and corruption, and a Hobbesian appreciation of how fear shapes human behavior. And yet he consistently, and with apparent sincerity, professes optimism that the world is bending toward justice. He is, in a way, a Hobbesian optimist.
  • The contradictions do not end there. Though he has a reputation for prudence, he has also been eager to question some of the long-standing assumptions undergirding traditional U.S. foreign-policy thinking. To a remarkable degree, he is willing to question why America’s enemies are its enemies, or why some of its friends are its friends.
  • It is assumed, at least among his critics, that Obama sought the Iran deal because he has a vision of a historic American-Persian rapprochement. But his desire for the nuclear agreement was born of pessimism as much as it was of optimism. “The Iran deal was never primarily about trying to open a new era of relations between the U.S. and Iran,” Susan Rice told me. “It was far more pragmatic and minimalist. The aim was very simply to make a dangerous country substantially less dangerous. No one had any expectation that Iran would be a more benign actor.”
  • I once mentioned to Obama a scene from The Godfather: Part III, in which Michael Corleone complains angrily about his failure to escape the grasp of organized crime. I told Obama that the Middle East is to his presidency what the Mob is to Corleone, and I started to quote the Al Pacino line: “Just when I thought I was out—” “It pulls you back in,” Obama said, completing the thought
  • When I asked Obama recently what he had hoped to accomplish with his Cairo reset speech, he said that he had been trying—unsuccessfully, he acknowledged—to persuade Muslims to more closely examine the roots of their unhappiness. “My argument was this: Let’s all stop pretending that the cause of the Middle East’s problems is Israel,” he told me. “We want to work to help achieve statehood and dignity for the Palestinians, but I was hoping that my speech could trigger a discussion, could create space for Muslims to address the real problems they are confronting—problems of governance, and the fact that some currents of Islam have not gone through a reformation that would help people adapt their religious doctrines to modernity. My thought was, I would communicate that the U.S. is not standing in the way of this progress, that we would help, in whatever way possible, to advance the goals of a practical, successful Arab agenda that provided a better life for ordinary people.”
  • But over the next three years, as the Arab Spring gave up its early promise, and brutality and dysfunction overwhelmed the Middle East, the president grew disillusioned. Some of his deepest disappointments concern Middle Eastern leaders themselves. Benjamin Netanyahu is in his own category: Obama has long believed that Netanyahu could bring about a two-state solution that would protect Israel’s status as a Jewish-majority democracy, but is too fearful and politically paralyzed to do so
  • Obama has also not had much patience for Netanyahu and other Middle Eastern leaders who question his understanding of the region. In one of Netanyahu’s meetings with the president, the Israeli prime minister launched into something of a lecture about the dangers of the brutal region in which he lives, and Obama felt that Netanyahu was behaving in a condescending fashion, and was also avoiding the subject at hand: peace negotiations. Finally, the president interrupted the prime minister: “Bibi, you have to understand something,” he said. “I’m the African American son of a single mother, and I live here, in this house. I live in the White House. I managed to get elected president of the United States. You think I don’t understand what you’re talking about, but I do.”
  • Other leaders also frustrate him immensely. Early on, Obama saw Recep Tayyip Erdoğan, the president of Turkey, as the sort of moderate Muslim leader who would bridge the divide between East and West—but Obama now considers him a failure and an authoritarian, one who refuses to use his enormous army to bring stability to Syria
  • In recent days, the president has taken to joking privately, “All I need in the Middle East is a few smart autocrats.” Obama has always had a fondness for pragmatic, emotionally contained technocrats, telling aides, “If only everyone could be like the Scandinavians, this would all be easy.”
  • The unraveling of the Arab Spring darkened the president’s view of what the U.S. could achieve in the Middle East, and made him realize how much the chaos there was distracting from other priorities. “The president recognized during the course of the Arab Spring that the Middle East was consuming us,”
  • But what sealed Obama’s fatalistic view was the failure of his administration’s intervention in Libya, in 2011
  • Obama says today of the intervention, “It didn’t work.” The U.S., he believes, planned the Libya operation carefully—and yet the country is still a disaster.
  • “So we actually executed this plan as well as I could have expected: We got a UN mandate, we built a coalition, it cost us $1 billion—which, when it comes to military operations, is very cheap. We averted large-scale civilian casualties, we prevented what almost surely would have been a prolonged and bloody civil conflict. And despite all that, Libya is a mess.”
  • Mess is the president’s diplomatic term; privately, he calls Libya a “shit show,” in part because it’s subsequently become an isis haven—one that he has already targeted with air strikes. It became a shit show, Obama believes, for reasons that had less to do with American incompetence than with the passivity of America’s allies and with the obdurate power of tribalism.
  • Of France, he said, “Sarkozy wanted to trumpet the flights he was taking in the air campaign, despite the fact that we had wiped out all the air defenses and essentially set up the entire infrastructure” for the intervention. This sort of bragging was fine, Obama said, because it allowed the U.S. to “purchase France’s involvement in a way that made it less expensive for us and less risky for us.” In other words, giving France extra credit in exchange for less risk and cost to the United States was a useful trade-off—except that “from the perspective of a lot of the folks in the foreign-policy establishment, well, that was terrible. If we’re going to do something, obviously we’ve got to be up front, and nobody else is sharing in the spotlight.”
  • Obama also blamed internal Libyan dynamics. “The degree of tribal division in Libya was greater than our analysts had expected. And our ability to have any kind of structure there that we could interact with and start training and start providing resources broke down very quickly.”
  • Libya proved to him that the Middle East was best avoided. “There is no way we should commit to governing the Middle East and North Africa,” he recently told a former colleague from the Senate. “That would be a basic, fundamental mistake.”
  • Obama did not come into office preoccupied by the Middle East. He is the first child of the Pacific to become president—born in Hawaii, raised there and, for four years, in Indonesia—and he is fixated on turning America’s attention to Asia
  • For Obama, Asia represents the future. Africa and Latin America, in his view, deserve far more U.S. attention than they receive. Europe, about which he is unromantic, is a source of global stability that requires, to his occasional annoyance, American hand-holding. And the Middle East is a region to be avoided—one that, thanks to America’s energy revolution, will soon be of negligible relevance to the U.S. economy.
  • Advisers recall that Obama would cite a pivotal moment in The Dark Knight, the 2008 Batman movie, to help explain not only how he understood the role of isis, but how he understood the larger ecosystem in which it grew. “There’s a scene in the beginning in which the gang leaders of Gotham are meeting,” the president would say. “These are men who had the city divided up. They were thugs, but there was a kind of order. Everyone had his turf. And then the Joker comes in and lights the whole city on fire. isil is the Joker. It has the capacity to set the whole region on fire. That’s why we have to fight it.”
  • The rise of the Islamic State deepened Obama’s conviction that the Middle East could not be fixed—not on his watch, and not for a generation to come.
  • The traveling White House press corps was unrelenting: “Isn’t it time for your strategy to change?” one reporter asked. This was followed by “Could I ask you to address your critics who say that your reluctance to enter another Middle East war, and your preference of diplomacy over using the military, makes the United States weaker and emboldens our enemies?” And then came this imperishable question, from a CNN reporter: “If you’ll forgive the language—why can’t we take out these bastards?” Which was followed by “Do you think you really understand this enemy well enough to defeat them and to protect the homeland?”
  • This rhetoric appeared to frustrate Obama immensely. “When I hear folks say that, well, maybe we should just admit the Christians but not the Muslims; when I hear political leaders suggesting that there would be a religious test for which person who’s fleeing from a war-torn country is admitted,” Obama told the assembled reporters, “that’s not American. That’s not who we are. We don’t have religious tests to our compassion.”
  • he has never believed that terrorism poses a threat to America commensurate with the fear it generates. Even during the period in 2014 when isis was executing its American captives in Syria, his emotions were in check. Valerie Jarrett, Obama’s closest adviser, told him people were worried that the group would soon take its beheading campaign to the U.S. “They’re not coming here to chop our heads off,” he reassured her.
  • Obama frequently reminds his staff that terrorism takes far fewer lives in America than handguns, car accidents, and falls in bathtubs do.
  • Several years ago, he expressed to me his admiration for Israelis’ “resilience” in the face of constant terrorism, and it is clear that he would like to see resilience replace panic in American society. Nevertheless, his advisers are fighting a constant rearguard action to keep Obama from placing terrorism in what he considers its “proper” perspective, out of concern that he will seem insensitive to the fears of the American people.
  • When I noted to Kerry that the president’s rhetoric doesn’t match his, he said, “President Obama sees all of this, but he doesn’t gin it up into this kind of—he thinks we are on track. He has escalated his efforts. But he’s not trying to create hysteria … I think the president is always inclined to try to keep things on an appropriate equilibrium. I respect that.”
  • Obama modulates his discussion of terrorism for several reasons: He is, by nature, Spockian. And he believes that a misplaced word, or a frightened look, or an ill-considered hyperbolic claim, could tip the country into panic. The sort of panic he worries about most is the type that would manifest itself in anti-Muslim xenophobia or in a challenge to American openness and to the constitutional order.
  • The president also gets frustrated that terrorism keeps swamping his larger agenda, particularly as it relates to rebalancing America’s global priorities. For years, the “pivot to Asia” has been a paramount priority of his. America’s economic future lies in Asia, he believes, and the challenge posed by China’s rise requires constant attention. From his earliest days in office, Obama has been focused on rebuilding the sometimes-threadbare ties between the U.S. and its Asian treaty partners, and he is perpetually on the hunt for opportunities to draw other Asian nations into the U.S. orbit. His dramatic opening to Burma was one such opportunity; Vietnam and the entire constellation of Southeast Asian countries fearful of Chinese domination presented others.
  • Obama believes, Carter said, that Asia “is the part of the world of greatest consequence to the American future, and that no president can take his eye off of this.” He added, “He consistently asks, even in the midst of everything else that’s going on, ‘Where are we in the Asia-Pacific rebalance? Where are we in terms of resources?’ He’s been extremely consistent about that, even in times of Middle East tension.”
  • “Right now, I don’t think that anybody can be feeling good about the situation in the Middle East,” he said. “You have countries that are failing to provide prosperity and opportunity for their people. You’ve got a violent, extremist ideology, or ideologies, that are turbocharged through social media. You’ve got countries that have very few civic traditions, so that as autocratic regimes start fraying, the only organizing principles are sectarian.”
  • He went on, “Contrast that with Southeast Asia, which still has huge problems—enormous poverty, corruption—but is filled with striving, ambitious, energetic people who are every single day scratching and clawing to build businesses and get education and find jobs and build infrastructure. The contrast is pretty stark.”
  • In Asia, as well as in Latin America and Africa, Obama says, he sees young people yearning for self-improvement, modernity, education, and material wealth. “They are not thinking about how to kill Americans,” he says. “What they’re thinking about is How do I get a better education? How do I create something of value?”
  • He then made an observation that I came to realize was representative of his bleakest, most visceral understanding of the Middle East today—not the sort of understanding that a White House still oriented around themes of hope and change might choose to advertise. “If we’re not talking to them,” he said, referring to young Asians and Africans and Latin Americans, “because the only thing we’re doing is figuring out how to destroy or cordon off or control the malicious, nihilistic, violent parts of humanity, then we’re missing the boat.
  • He does resist refracting radical Islam through the “clash of civilizations” prism popularized by the late political scientist Samuel Huntington. But this is because, he and his advisers argue, he does not want to enlarge the ranks of the enemy. “The goal is not to force a Huntington template onto this conflict,” said John Brennan, the CIA director.
  • “It is very clear what I mean,” he told me, “which is that there is a violent, radical, fanatical, nihilistic interpretation of Islam by a faction—a tiny faction—within the Muslim community that is our enemy, and that has to be defeated.”
  • “There is also the need for Islam as a whole to challenge that interpretation of Islam, to isolate it, and to undergo a vigorous discussion within their community about how Islam works as part of a peaceful, modern society,” he said. But he added, “I do not persuade peaceful, tolerant Muslims to engage in that debate if I’m not sensitive to their concern that they are being tagged with a broad brush.”
  • In private encounters with other world leaders, Obama has argued that there will be no comprehensive solution to Islamist terrorism until Islam reconciles itself to modernity and undergoes some of the reforms that have changed Christianity.
  • Obama described how he has watched Indonesia gradually move from a relaxed, syncretistic Islam to a more fundamentalist, unforgiving interpretation; large numbers of Indonesian women, he observed, have now adopted the hijab, the Muslim head covering.
  • Why, Turnbull asked, was this happening? Because, Obama answered, the Saudis and other Gulf Arabs have funneled money, and large numbers of imams and teachers, into the country. In the 1990s, the Saudis heavily funded Wahhabist madrassas, seminaries that teach the fundamentalist version of Islam favored by the Saudi ruling family, Obama told Turnbull. Today, Islam in Indonesia is much more Arab in orientation than it was when he lived there, he said.
  • “Aren’t the Saudis your friends?,” Turnbull asked. Obama smiled. “It’s complicated,” he said.
  • But he went on to say that the Saudis need to “share” the Middle East with their Iranian foes. “The competition between the Saudis and the Iranians—which has helped to feed proxy wars and chaos in Syria and Iraq and Yemen—requires us to say to our friends as well as to the Iranians that they need to find an effective way to share the neighborhood and institute some sort of cold peace.”
  • “An approach that said to our friends ‘You are right, Iran is the source of all problems, and we will support you in dealing with Iran’ would essentially mean that as these sectarian conflicts continue to rage and our Gulf partners, our traditional friends, do not have the ability to put out the flames on their own or decisively win on their own, and would mean that we have to start coming in and using our military power to settle scores. And that would be in the interest neither of the United States nor of the Middle East.”
  • One of the most destructive forces in the Middle East, Obama believes, is tribalism—a force no president can neutralize. Tribalism, made manifest in the reversion to sect, creed, clan, and village by the desperate citizens of failing states, is the source of much of the Muslim Middle East’s problems, and it is another source of his fatalism. Obama has deep respect for the destructive resilience of tribalism—part of his memoir, Dreams From My Father, concerns the way in which tribalism in post-colonial Kenya helped ruin his father’s life—which goes some distance in explaining why he is so fastidious about avoiding entanglements in tribal conflicts.
  • “It is literally in my DNA to be suspicious of tribalism,” he told me. “I understand the tribal impulse, and acknowledge the power of tribal division. I’ve been navigating tribal divisions my whole life. In the end, it’s the source of a lot of destructive acts.”
  • “Look, I am not of the view that human beings are inherently evil,” he said. “I believe that there’s more good than bad in humanity. And if you look at the trajectory of history, I am optimistic.
  • “I believe that overall, humanity has become less violent, more tolerant, healthier, better fed, more empathetic, more able to manage difference. But it’s hugely uneven. And what has been clear throughout the 20th and 21st centuries is that the progress we make in social order and taming our baser impulses and steadying our fears can be reversed very quickly. Social order starts breaking down if people are under profound stress. Then the default position is tribe—us/them, a hostility toward the unfamiliar or the unknown.”
  • He continued, “Right now, across the globe, you’re seeing places that are undergoing severe stress because of globalization, because of the collision of cultures brought about by the Internet and social media, because of scarcities—some of which will be attributable to climate change over the next several decades—because of population growth. And in those places, the Middle East being Exhibit A, the default position for a lot of folks is to organize tightly in the tribe and to push back or strike out against those who are different.
  • “A group like isil is the distillation of every worst impulse along these lines. The notion that we are a small group that defines ourselves primarily by the degree to which we can kill others who are not like us, and attempting to impose a rigid orthodoxy that produces nothing, that celebrates nothing, that really is contrary to every bit of human progress—it indicates the degree to which that kind of mentality can still take root and gain adherents in the 21st century.”
  • “We have to determine the best tools to roll back those kinds of attitudes,” he said. “There are going to be times where either because it’s not a direct threat to us or because we just don’t have the tools in our toolkit to have a huge impact that, tragically, we have to refrain from jumping in with both feet.”
  • I asked Obama whether he would have sent the Marines to Rwanda in 1994 to stop the genocide as it was happening, had he been president at the time. “Given the speed with which the killing took place, and how long it takes to crank up the machinery of the U.S. government, I understand why we did not act fast enough,” he said. “Now, we should learn from that.
  • I actually think that Rwanda is an interesting test case because it’s possible—not guaranteed, but it’s possible—that this was a situation where the quick application of force might have been enough.
  • “Ironically, it’s probably easier to make an argument that a relatively small force inserted quickly with international support would have resulted in averting genocide [more successfully in Rwanda] than in Syria right now, where the degree to which the various groups are armed and hardened fighters and are supported by a whole host of external actors with a lot of resources requires a much larger commitment of forces.”
  • The Turkey press conference, I told him, “was a moment for you as a politician to say, ‘Yeah, I hate the bastards too, and by the way, I am taking out the bastards.’ ” The easy thing to do would have been to reassure Americans in visceral terms that he will kill the people who want to kill them. Does he fear a knee-jerk reaction in the direction of another Middle East invasion? Or is he just inalterably Spockian?
  • “Every president has strengths and weaknesses,” he answered. “And there is no doubt that there are times where I have not been attentive enough to feelings and emotions and politics in communicating what we’re doing and how we’re doing it.”
  • But for America to be successful in leading the world, he continued, “I believe that we have to avoid being simplistic. I think we have to build resilience and make sure that our political debates are grounded in reality. It’s not that I don’t appreciate the value of theater in political communications; it’s that the habits we—the media, politicians—have gotten into, and how we talk about these issues, are so detached so often from what we need to be doing that for me to satisfy the cable news hype-fest would lead to us making worse and worse decisions over time.”
  • “During the couple of months in which everybody was sure Ebola was going to destroy the Earth and there was 24/7 coverage of Ebola, if I had fed the panic or in any way strayed from ‘Here are the facts, here’s what needs to be done, here’s how we’re handling it, the likelihood of you getting Ebola is very slim, and here’s what we need to do both domestically and overseas to stamp out this epidemic,’ ” then “maybe people would have said ‘Obama is taking this as seriously as he needs to be.’ ” But feeding the panic by overreacting could have shut down travel to and from three African countries that were already cripplingly poor, in ways that might have destroyed their economies—which would likely have meant, among other things, a recurrence of Ebola. He added, “It would have also meant that we might have wasted a huge amount of resources in our public-health systems that need to be devoted to flu vaccinations and other things that actually kill people” in large numbers in America.
  • “I have friends who have kids in Paris right now,” he said. “And you and I and a whole bunch of people who are writing about what happened in Paris have strolled along the same streets where people were gunned down. And it’s right to feel fearful. And it’s important for us not to ever get complacent. There’s a difference between resilience and complacency.” He went on to describe another difference—between making considered decisions and making rash, emotional ones. “What it means, actually, is that you care so much that you want to get it right and you’re not going to indulge in either impetuous or, in some cases, manufactured responses that make good sound bites but don’t produce results. The stakes are too high to play those games.”
  • The other meeting took place two months later, in the Oval Office, between Obama and the general secretary of the Vietnamese Communist Party, Nguyen Phu Trong. This meeting took place only because John Kerry had pushed the White House to violate protocol, since the general secretary was not a head of state. But the goals trumped decorum: Obama wanted to lobby the Vietnamese on the Trans-Pacific Partnership—his negotiators soon extracted a promise from the Vietnamese that they would legalize independent labor unions—and he wanted to deepen cooperation on strategic issues. Administration officials have repeatedly hinted to me that Vietnam may one day soon host a permanent U.S. military presence, to check the ambitions of the country it now fears most, China. The U.S. Navy’s return to Cam Ranh Bay would count as one of the more improbable developments in recent American history. “We just moved the Vietnamese Communist Party to recognize labor rights in a way that we could never do by bullying them or scaring them,” Obama told me, calling this a key victory in his campaign to replace stick-waving with diplomatic persuasion.
  • I noted that the 200 or so young Southeast Asians in the room earlier that day—including citizens of Communist-ruled countries—seemed to love America. “They do,” Obama said. “In Vietnam right now, America polls at 80 percent.”
  • The resurgent popularity of America throughout Southeast Asia means that “we can do really big, important stuff—which, by the way, then has ramifications across the board,” he said, “because when Malaysia joins the anti-isil campaign, that helps us leverage resources and credibility in our fight against terrorism. When we have strong relations with Indonesia, that helps us when we are going to Paris and trying to negotiate a climate treaty, where the temptation of a Russia or some of these other countries may be to skew the deal in a way that is unhelpful.
  • Obama then cited America’s increased influence in Latin America—increased, he said, in part by his removal of a region-wide stumbling block when he reestablished ties with Cuba—as proof that his deliberate, nonthreatening, diplomacy-centered approach to foreign relations is working. The alba movement, a group of Latin American governments oriented around anti-Americanism, has significantly weakened during his time as president. “When I came into office, at the first Summit of the Americas that I attended, Hugo Chávez”—the late anti-American Venezuelan dictator—“was still the dominant figure in the conversation,” he said. “We made a very strategic decision early on, which was, rather than blow him up as this 10-foot giant adversary, to right-size the problem and say, ‘We don’t like what’s going on in Venezuela, but it’s not a threat to the United States.’
  • Obama said that to achieve this rebalancing, the U.S. had to absorb the diatribes and insults of superannuated Castro manqués. “When I saw Chávez, I shook his hand and he handed me a Marxist critique of the U.S.–Latin America relationship,” Obama recalled. “And I had to sit there and listen to Ortega”—Daniel Ortega, the radical leftist president of Nicaragua—“make an hour-long rant against the United States. But us being there, not taking all that stuff seriously—because it really wasn’t a threat to us”—helped neutralize the region’s anti-Americanism.
  • “The truth is, actually, Putin, in all of our meetings, is scrupulously polite, very frank. Our meetings are very businesslike. He never keeps me waiting two hours like he does a bunch of these other folks.” Obama said that Putin believes his relationship with the U.S. is more important than Americans tend to think. “He’s constantly interested in being seen as our peer and as working with us, because he’s not completely stupid. He understands that Russia’s overall position in the world is significantly diminished. And the fact that he invades Crimea or is trying to prop up Assad doesn’t suddenly make him a player.
  • “The argument is made,” I said, “that Vladimir Putin watched you in Syria and thought, He’s too logical, he’s too rational, he’s too into retrenchment. I’m going to push him a little bit further in Ukraine.”
  • “Look, this theory is so easily disposed of that I’m always puzzled by how people make the argument. I don’t think anybody thought that George W. Bush was overly rational or cautious in his use of military force. And as I recall, because apparently nobody in this town does, Putin went into Georgia on Bush’s watch, right smack dab in the middle of us having over 100,000 troops deployed in Iraq.” Obama was referring to Putin’s 2008 invasion of Georgia, a former Soviet republic, which was undertaken for many of the same reasons Putin later invaded Ukraine—to keep an ex–Soviet republic in Russia’s sphere of influence.
  • “Putin acted in Ukraine in response to a client state that was about to slip out of his grasp. And he improvised in a way to hang on to his control there,” he said. “He’s done the exact same thing in Syria, at enormous cost to the well-being of his own country. And the notion that somehow Russia is in a stronger position now, in Syria or in Ukraine, than they were before they invaded Ukraine or before he had to deploy military forces to Syria is to fundamentally misunderstand the nature of power in foreign affairs or in the world generally. Real power means you can get what you want without having to exert violence. Russia was much more powerful when Ukraine looked like an independent country but was a kleptocracy that he could pull the strings on.”
  • Obama’s theory here is simple: Ukraine is a core Russian interest but not an American one, so Russia will always be able to maintain escalatory dominance there. “The fact is that Ukraine, which is a non-nato country, is going to be vulnerable to military domination by Russia no matter what we do,” he said.
  • “I think that the best argument you can make on the side of those who are critics of my foreign policy is that the president doesn’t exploit ambiguity enough. He doesn’t maybe react in ways that might cause people to think, Wow, this guy might be a little crazy.” “The ‘crazy Nixon’ approach,” I said: Confuse and frighten your enemies by making them think you’re capable of committing irrational acts.
  • “But let’s examine the Nixon theory,” he said. “So we dropped more ordnance on Cambodia and Laos than on Europe in World War II, and yet, ultimately, Nixon withdrew, Kissinger went to Paris, and all we left behind was chaos, slaughter, and authoritarian governments.
  • “There is no evidence in modern American foreign policy that that’s how people respond. People respond based on what their imperatives are, and if it’s really important to somebody, and it’s not that important to us, they know that, and we know that,” he said. “There are ways to deter, but it requires you to be very clear ahead of time about what is worth going to war for and what is not.
  • Now, if there is somebody in this town that would claim that we would consider going to war with Russia over Crimea and eastern Ukraine, they should speak up and be very clear about it. The idea that talking tough or engaging in some military action that is tangential to that particular area is somehow going to influence the decision making of Russia or China is contrary to all the evidence we have seen over the last 50 years.”
  • “If you think about, let’s say, the Iran hostage crisis, there is a narrative that has been promoted today by some of the Republican candidates that the day Reagan was elected, because he looked tough, the Iranians decided, ‘We better turn over these hostages,’ ” he said. “In fact what had happened was that there was a long negotiation with the Iranians and because they so disliked Carter—even though the negotiations had been completed—they held those hostages until the day Reagan got elected.
  • When you think of the military actions that Reagan took, you have Grenada—which is hard to argue helped our ability to shape world events, although it was good politics for him back home. You have the Iran-Contra affair, in which we supported right-wing paramilitaries and did nothing to enhance our image in Central America, and it wasn’t successful at all.” He reminded me that Reagan’s great foe, Daniel Ortega, is today the unrepentant president of Nicaragua.
  • Obama also cited Reagan’s decision to almost immediately pull U.S. forces from Lebanon after 241 servicemen were killed in a Hezbollah attack in 1983. “Apparently all these things really helped us gain credibility with the Russians and the Chinese,” because “that’s the narrative that is told,” he said sarcastically.
  • “Now, I actually think that Ronald Reagan had a great success in foreign policy, which was to recognize the opportunity that Gorbachev presented and to engage in extensive diplomacy—which was roundly criticized by some of the same people who now use Ronald Reagan to promote the notion that we should go around bombing people.”
  • “As I survey the next 20 years, climate change worries me profoundly because of the effects that it has on all the other problems that we face,” he said. “If you start seeing more severe drought; more significant famine; more displacement from the Indian subcontinent and coastal regions in Africa and Asia; the continuing problems of scarcity, refugees, poverty, disease—this makes every other problem we’ve got worse. That’s above and beyond just the existential issues of a planet that starts getting into a bad feedback loop.”
  • Terrorism, he said, is also a long-term problem “when combined with the problem of failed states.”
  • What country does he consider the greatest challenge to America in the coming decades? “In terms of traditional great-state relations, I do believe that the relationship between the United States and China is going to be the most critical,” he said. “If we get that right and China continues on a peaceful rise, then we have a partner that is growing in capability and sharing with us the burdens and responsibilities of maintaining an international order. If China fails; if it is not able to maintain a trajectory that satisfies its population and has to resort to nationalism as an organizing principle; if it feels so overwhelmed that it never takes on the responsibilities of a country its size in maintaining the international order; if it views the world only in terms of regional spheres of influence—then not only do we see the potential for conflict with China, but we will find ourselves having more difficulty dealing with these other challenges that are going to come.”
  • I’ve been very explicit in saying that we have more to fear from a weakened, threatened China than a successful, rising China,” Obama said. “I think we have to be firm where China’s actions are undermining international interests, and if you look at how we’ve operated in the South China Sea, we have been able to mobilize most of Asia to isolate China in ways that have surprised China, frankly, and have very much served our interest in strengthening our alliances.”
  • A weak, flailing Russia constitutes a threat as well, though not quite a top-tier threat. “Unlike China, they have demographic problems, economic structural problems, that would require not only vision but a generation to overcome,” Obama said. “The path that Putin is taking is not going to help them overcome those challenges. But in that environment, the temptation to project military force to show greatness is strong, and that’s what Putin’s inclination is. So I don’t underestimate the dangers there.”
  • “You know, the notion that diplomacy and technocrats and bureaucrats somehow are helping to keep America safe and secure, most people think, Eh, that’s nonsense. But it’s true. And by the way, it’s the element of American power that the rest of the world appreciates unambiguously
  • When we deploy troops, there’s always a sense on the part of other countries that, even where necessary, sovereignty is being violated.”
  • Administration officials have told me that Vice President Biden, too, has become frustrated with Kerry’s demands for action. He has said privately to the secretary of state, “John, remember Vietnam? Remember how that started?” At a National Security Council meeting held at the Pentagon in December, Obama announced that no one except the secretary of defense should bring him proposals for military action. Pentagon officials understood Obama’s announcement to be a brushback pitch directed at Kerry.
  • Obama’s caution on Syria has vexed those in the administration who have seen opportunities, at different moments over the past four years, to tilt the battlefield against Assad. Some thought that Putin’s decision to fight on behalf of Assad would prompt Obama to intensify American efforts to help anti-regime rebels. But Obama, at least as of this writing, would not be moved, in part because he believed that it was not his business to stop Russia from making what he thought was a terrible mistake. “They are overextended. They’re bleeding,” he told me. “And their economy has contracted for three years in a row, drastically.
  • Obama’s strategy was occasionally referred to as the “Tom Sawyer approach.” Obama’s view was that if Putin wanted to expend his regime’s resources by painting the fence in Syria, the U.S. should let him.
  • By late winter, though, when it appeared that Russia was making advances in its campaign to solidify Assad’s rule, the White House began discussing ways to deepen support for the rebels, though the president’s ambivalence about more-extensive engagement remained. In conversations I had with National Security Council officials over the past couple of months, I sensed a foreboding that an event—another San Bernardino–style attack, for instance—would compel the United States to take new and direct action in Syria. For Obama, this would be a nightmare.
  • If there had been no Iraq, no Afghanistan, and no Libya, Obama told me, he might be more apt to take risks in Syria. “A president does not make decisions in a vacuum. He does not have a blank slate. Any president who was thoughtful, I believe, would recognize that after over a decade of war, with obligations that are still to this day requiring great amounts of resources and attention in Afghanistan, with the experience of Iraq, with the strains that it’s placed on our military—any thoughtful president would hesitate about making a renewed commitment in the exact same region of the world with some of the exact same dynamics and the same probability of an unsatisfactory outcome.”
  • What has struck me is that, even as his secretary of state warns about a dire, Syria-fueled European apocalypse, Obama has not recategorized the country’s civil war as a top-tier security threat.
  • This critique frustrates the president. “Nobody remembers bin Laden anymore,” he says. “Nobody talks about me ordering 30,000 more troops into Afghanistan.” The red-line crisis, he said, “is the point of the inverted pyramid upon which all other theories rest.
  • “Was it a bluff?” I told him that few people now believe he actually would have attacked Iran to keep it from getting a nuclear weapon. “That’s interesting,” he said, noncommittally. I started to talk: “Do you—” He interrupted. “I actually would have,” he said, meaning that he would have struck Iran’s nuclear facilities. “If I saw them break out.”
  • “You were right to believe it,” the president said. And then he made his key point. “This was in the category of an American interest.”
  • I was reminded then of something Derek Chollet, a former National Security Council official, told me: “Obama is a gambler, not a bluffer.”
  • The president has placed some huge bets. Last May, as he was trying to move the Iran nuclear deal through Congress, I told him that the agreement was making me nervous. His response was telling. “Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
  • In the matter of the Syrian regime and its Iranian and Russian sponsors, Obama has bet, and seems prepared to continue betting, that the price of direct U.S. action would be higher than the price of inaction. And he is sanguine enough to live with the perilous ambiguities of his decisions
  • Though in his Nobel Peace Prize speech in 2009, Obama said, “Inaction tears at our conscience and can lead to more costly intervention later,” today the opinions of humanitarian interventionists do not seem to move him, at least not publicly
  • As he comes to the end of his presidency, Obama believes he has done his country a large favor by keeping it out of the maelstrom—and he believes, I suspect, that historians will one day judge him wise for having done so
  • Inside the West Wing, officials say that Obama, as a president who inherited a financial crisis and two active wars from his predecessor, is keen to leave “a clean barn” to whoever succeeds him. This is why the fight against ISIS, a group he considers to be a direct, though not existential, threat to the U.S., is his most urgent priority for the remainder of his presidency; killing the so-called caliph of the Islamic State, Abu Bakr al-Baghdadi, is one of the top goals of the American national-security apparatus in Obama’s last year.
  • This is what is so controversial about the president’s approach, and what will be controversial for years to come—the standard he has used to define what, exactly, constitutes a direct threat.
  • Obama has come to a number of dovetailing conclusions about the world, and about America’s role in it. The first is that the Middle East is no longer terribly important to American interests. The second is that even if the Middle East were surpassingly important, there would still be little an American president could do to make it a better place. The third is that the innate American desire to fix the sorts of problems that manifest themselves most drastically in the Middle East inevitably leads to warfare, to the deaths of U.S. soldiers, and to the eventual hemorrhaging of U.S. credibility and power. The fourth is that the world cannot afford to see the diminishment of U.S. power. Just as the leaders of several American allies have found Obama’s leadership inadequate to the tasks before him, he himself has found world leadership wanting: global partners who often lack the vision and the will to spend political capital in pursuit of broad, progressive goals, and adversaries who are not, in his mind, as rational as he is. Obama believes that history has sides, and that America’s adversaries—and some of its putative allies—have situated themselves on the wrong one, a place where tribalism, fundamentalism, sectarianism, and militarism still flourish. What they don’t understand is that history is bending in his direction.
  • “The central argument is that by keeping America from immersing itself in the crises of the Middle East, the foreign-policy establishment believes that the president is precipitating our decline,” Ben Rhodes told me. “But the president himself takes the opposite view, which is that overextension in the Middle East will ultimately harm our economy, harm our ability to look for other opportunities and to deal with other challenges, and, most important, endanger the lives of American service members for reasons that are not in the direct American national-security interest.
  • George W. Bush was also a gambler, not a bluffer. He will be remembered harshly for the things he did in the Middle East. Barack Obama is gambling that he will be judged well for the things he didn’t do.
Javier E

Two Worlds Cracking Up - NYTimes.com - 0 views

  • It turns out that Turkey these days is neither a bridge nor a gully. It’s an island — an island of relative stability between two great geopolitical systems that are cracking apart: the euro zone that came into being after the cold war, and the Arab state system that came into being after World War I are both coming unglued
  • The island of Turkey has become one of the best places to observe both these worlds. To the east, you see the European Monetary Union buckling under the weight of its own hubris — leaders who reached too far in forging a common currency without the common governance to sustain it. And, to the south, you see the Arab League crumbling under the weight of its own decay — leaders who never reached at all for the decent governance and modern education required to thrive in the age of globalization.
  • The Syrians failed to build Syria, the Egyptians failed to build Egypt, the Libyans failed to build Libya, the Yemenis failed to build Yemen. Those are even bigger problems because, as their states have been stressed or fractured, no one knows how they’ll be put back together again.
  • ...3 more annotations...
  • Europeans failed to build Europe, and that is now a big problem because, as its common currency comes under pressure and the E.U. goes deeper into recession, the whole world feels the effects
  • In Europe, the supranational project did not work, and now, to a degree, Europe is falling back into individual states.
  • In the Arab world, the national project did not work, so some of the Arab states are falling back onto sects, tribes, regions and clans.
grayton downing

BBC Sport - Jamaica doping scandals tip of iceberg, says senior drug tester - 0 views

  • Jamaica's most senior drug tester says the country's recent rash of failed tests might be the "tip of an iceberg".
  • Asafa Powell, the former 100m world record holder, was the biggest name to test positive, but four others including Powell's training partner - the Olympic relay gold medallist Sherone Simpson - also failed tests at the country's national trials in June.
  • Wada officials are due to discuss their visit to Jamaica at an executive board meeting in Johannesburg on Tuesday and could make a series of recommendations to improve the country's anti-doping policies.
  • ...9 more annotations...
  • The problem is these people were tested positive in competition. What that means is months before you know the date of the test and the approximate time of the test.
  • So if you fail an in-competition test you haven't only failed a drugs test, you have failed an IQ test.
  • That funding - with the help of additional money from Wada - would be used to hire more senior executives to run the anti-doping programme and to hire and train additional drug testers.
  • Our athletes, as confirmed by the IAAF, were the most tested in the world of athletics, so to say your athletes weren't tested is not exactly true.
  • There is a problem worldwide with the use of supplements," said Fennell. "The whole world is induced to use supplements for one thing or another.
  • Athletes are no different. This is not with a view to cheating and I would put my head on the block and say our athletes do not set out to cheat.
  • We do have rigorous testing. If you look at the record for this year you will see our testing record is amazing. Those of our top athletes are on the registered international programmes.
  • "I understand why people pay more attention to Jamaica," said Carter, who won an individual bronze medal in the 100m in Moscow in August to add to his sprint relay gold from the London Olympics.
  • "It was the same when the US dominated. People said they were on drugs and should be tested. That's a part of the sport and we have to accept that. It's going to hurt fans and athletes because no-one wants to be associated with what's going on.
Javier E

Greece's failed state and Europe's response | openDemocracy - 0 views

  • the problem with Greece is much more profound than most politicians and analysts have so far calculated. In stark reality, and far more dangerously than its present financial crisis threatening the euro project, Greece looks like a failed EU state, which, as such, puts at risk the stability of the entire European project. This is why a European political, rather than simply economic, plan for rescuing Greece must become a most urgent priority.
  • failed states, according to the definition provided by the Crisis States Research Centre of the London School of Economics, can no longer reproduce the conditions of their own existence and, therefore, are under threat of imminent collapse.
  • Greece’s increasing difficulty in coping with border security as tens of thousands of illegal immigrants and asylum seekers enter each year from Turkey.
  • ...7 more annotations...
  • The civil service, notorious for its inefficiency and endemic corruption, has long ago lost all popular confidence in it. It is also unable to collect tax revenue.
  • The failure of the state is clearly reflected in the fast deterioration of the services it provides in basic goods, such as education, health, sanitation, and transport. Diminished funds, frequent strikes, and low morale in public administration have combined to bring the state virtually to a halt.
  • Nor are authorities able to protect citizens, and their properties, from social disorder, widespread vandalism, and occasional bouts of violence
  • this has caused a breakdown in the rule of law.
  • Turn to the political class and you will find that its authority has vanished into thin air. Prime minister George Papandreou may have shown political courage at times, but has no control over either his party or his government.
  • Greece’s GDP will shrink by 5% this year and will be 2% smaller in 2012; the economy is expected to contract for four successive years. Greece is meanwhile faced with one of the greater human flights in her history. This includes three categories of individual, mostly young people, who should be the most valuable to Greece if Greece is to overcome its present crisis: academics and intellectuals; entrepreneurial middle-class professionals with technical skills and expertise; and long-term immigrants who, in the last two decades, were the backbone of Greece’s development
  • Evidently, the problem with Greece is far deeper and much more serious than this country’s (already deep and serious) financial crisis. It involves nothing less than a complete and drastic overhaul of the Greek state system. It should however be equally obvious that, as Greece is an EU country, Europe cannot but face up to this problem and contribute to its solution, as any alternative route must inevitably lead to the collapse of the entire project of European integration.
Javier E

Americans Are Paying the Price for Trump's Failures - The Atlantic - 0 views

  • “I don’t take responsibility at all,” said President Donald Trump
  • Those words will probably end up as the epitaph of his presidency
  • Trump now fancies himself a “wartime president.” How is his war going?
  • ...47 more annotations...
  • On the present trajectory, it will kill, by late April, more Americans than Vietnam. Having earlier promised that casualties could be held near zero, Trump now claims he will have done a “very good job” if the toll is held below 200,000 dead.
  • The United States is on trajectory to suffer more sickness, more dying, and more economic harm from this virus than any other comparably developed country.
  • The loss of stockpiled respirators to breakage because the federal government let maintenance contracts lapse in 2018 is Trump’s fault. The failure to store sufficient protective medical gear in the national arsenal is Trump’s fault
  • That states are bidding against other states for equipment, paying many multiples of the precrisis price for ventilators, is Trump’s fault. Air travelers summoned home and forced to stand for hours in dense airport crowds alongside infected people? That was Trump’s fault too
  • Trump failed. He is failing. He will continue to fail. And Americans are paying for his failures.
  • The lying about the coronavirus by hosts on Fox News and conservative talk radio is Trump’s fault: They did it to protect him
  • The false hope of instant cures and nonexistent vaccines is Trump’s fault, because he told those lies to cover up his failure to act in time.
  • The severity of the economic crisis is Trump’s fault; things would have been less bad if he had acted faster instead of sending out his chief economic adviser and his son Eric to assure Americans that the first stock-market dips were buying opportunities.
  • The fact that so many key government jobs were either empty or filled by mediocrities? Trump’s fault. The insertion of Trump’s arrogant and incompetent son-in-law as commander in chief of the national medical supply chain? Trump’s fault.
  • sooner or later, every president must face a supreme test, a test that cannot be evaded by blather and bluff and bullying.
  • Ten weeks of insisting that the coronavirus is a harmless flu that would miraculously go away on its own? Trump’s fault again. The refusal of red-state governors to act promptly, the failure to close Florida and Gulf Coast beaches until late March? That fault is more widely shared, but again, responsibility rests with Trump: He could have stopped it, and he did not.
  • Those lost weeks also put the United States—and thus the world—on the path to an economic collapse steeper than any in recent memory.
  • It’s a good guess that the unemployment rate had reached 13 percent by April 3. It may peak at 20 percent, perhaps even higher, and threatens to stay at Great Depression–like levels at least into 2021, maybe longer.
  • This country—buffered by oceans from the epicenter of the global outbreak, in East Asia; blessed with the most advanced medical technology on Earth; endowed with agencies and personnel devoted to responding to pandemics—could have and should have suffered less than nations nearer to China
  • Through the early weeks of the pandemic, when so much death and suffering could still have been prevented or mitigated, Trump joined passivity to fantasy. In those crucial early days, Trump made two big wagers. He bet that the virus could somehow be prevented from entering the United States by travel restrictions. And he bet that, to the extent that the virus had already entered the United States, it would burn off as the weather warmed.
  • If Trump truly was so trustingly ignorant as late as January 22, the fault was again his own. The Trump administration had cut U.S. public-health staff operating inside China by two-thirds, from 47 in January 2017 to 14 by 2019, an important reason it found itself dependent on less-accurate information from the World Health Organization. In July 2019, the Trump administration defunded the position that embedded an epidemiologist inside China’s own disease-control administration, again obstructing the flow of information to the United States.
  • Yet even if Trump did not know what was happening, other Americans did. On January 27, former Vice President Joe Biden sounded the alarm about a global pandemic in an op-ed in USA Today.
  • Because Trump puts so much emphasis on this point, it’s important to stress that none of this is true. Trump did not close the borders early—in fact, he did not truly close them at all.
  • Trump’s actions did little to stop the spread of the virus. The ban applied only to foreign nationals who had been in China during the previous 14 days, and included 11 categories of exceptions. Since the restrictions took effect, nearly 40,000 passengers have entered the United States from China, subjected to inconsistent screenings, The New York Times reported.
  • At a House hearing on February 5, a few days after the restrictions went into effect, Ron Klain—who led the Obama administration’s efforts against the Ebola outbreak—condemned the Trump policy as a “travel Band-Aid, not a travel ban.”
  • The president’s top priority through February 2020 was to exact retribution from truth-tellers in the impeachment fight.
  • Intentionally or not, Trump’s campaign of payback against his perceived enemies in the impeachment battle sent a warning to public-health officials: Keep your mouth shut
  • Throughout the crisis, the top priority of the president, and of everyone who works for the president, has been the protection of his ego
  • Denial became the unofficial policy of the administration through the month of February, and as a result, that of the administration’s surrogates and propagandists.
  • That same day, Secretary of State Mike Pompeo scolded a House committee for daring to ask him about the coronavirus. “We agreed that I’d come today to talk about Iran, and the first question today is not about Iran.”
  • The president’s lies must not be contradicted. And because the president’s lies change constantly, it’s impossible to predict what might contradict him.
  • During the pandemic, this psychological deformity has mutated into a deadly strategic vulnerability for the United States.
  • For three-quarters of his presidency, Trump has taken credit for the economic expansion that began under President Barack Obama in 2010. That expansion accelerated in 2014, just in time to deliver real prosperity over the past three years
  • The harm done by Trump’s own initiatives, and especially his trade wars, was masked by that continued growth.
  • The economy Trump inherited became his all-purpose answer to his critics. Did he break laws, corrupt the Treasury, appoint cronies, and tell lies? So what? Unemployment was down, the stock market up.
  • On February 28, very few Americans had heard of an estimated death toll of 35,000 to 40,000, but Trump had heard it. And his answer to that estimate was: “So far, we have lost nobody.” He conceded, “It doesn’t mean we won’t.” But he returned to his happy talk. “We are totally prepared.” And as always, it was the media's fault. “You hear 35 and 40,000 people and we’ve lost nobody and you wonder, the press is in hysteria mode.”
  • on February 28, it was still not too late to arrange an orderly distribution of medical supplies to the states, not too late to coordinate with U.S. allies, not too late to close the Florida beaches before spring break, not too late to bring passengers home from cruise lines, not too late to ensure that state unemployment-insurance offices were staffed and ready, not too late for local governments to get funds to food banks, not too late to begin social distancing fast and early
  • Stay-at-home orders could have been put into effect on March 1, not in late March and early April.
  • So much time had been wasted by the end of February. So many opportunities had been squandered. But even then, the shock could have been limited. Instead, Trump and his inner circle plunged deeper into two weeks of lies and denial, both about the disease and about the economy.
  • Kudlow repeated his advice that it was a good time to buy stocks on CNBC on March 6 after another bad week for the financial markets. As late as March 9, Trump was still arguing that the coronavirus would be no worse than the seasonal flu.
  • The overwhelmed president responded by doing what comes most naturally to him at moments of trouble: He shifted the blame to others.
  • Trump’s instinct to dodge and blame had devastating consequences for Americans. Every governor and mayor who needed the federal government to take action, every science and medical adviser who hoped to prevent Trump from doing something stupid or crazy, had to reckon with Trump’s psychic needs as their single biggest problem.
  • Governors got the message too. “If they don’t treat you right, I don’t call,” Trump explained at a White House press briefing on March 27. The federal response has been dogged by suspicions of favoritism for political and personal allies of Trump. The District of Columbia has seen its requests denied, while Florida gets everything it asks for.
  • The Trump administration is allocating some supplies through the Federal Emergency Management Agency, but has made the deliberate choice to allow large volumes of crucial supplies to continue to be distributed by commercial firms to their clients. That has left state governments bidding against one another, as if the 1787 Constitution had never been signed, and we have no national government.
  • Around the world, allies are registering that in an emergency, when it matters most, the United States has utterly failed to lead
  • As the pandemic kills, as the economic depression tightens its grip, Donald Trump has consistently put his own needs first. Right now, when his only care should be to beat the pandemic, Trump is renegotiating his debts with his bankers and lease payments with Palm Beach County.
  • He has never tried to be president of the whole United States, but at most 46 percent of it, to the extent that serving even the 46 percent has been consistent with his supreme concerns: stealing, loafing, and whining.
  • Now he is not even serving the 46 percent. The people most victimized by his lies and fantasies are the people who trusted him, the more conservative Americans who harmed themselves to prove their loyalty to Trump.
  • Governments often fail. From Pearl Harbor to the financial crisis of 2008, you can itemize a long list of missed warnings and overlooked dangers that cost lives and inflicted hardship. But in the past, Americans could at least expect public spirit and civic concern from their presidents.
  • Trump has mouthed the slogan “America first,” but he has never acted on it. It has always been “Trump first.” His business first. His excuses first. His pathetic vanity first.
  • Trump has taken millions in payments from the Treasury. He has taken millions in payments from U.S. businesses and foreign governments. He has taken millions in payments from the Republican Party and his own inaugural committee. He has taken so much that does not belong to him, that was unethical and even illegal for him to take. But responsibility? No, he will not take that.
  • Yet responsibility falls upon Trump, whether he takes it or not. No matter how much he deflects and insults and snivels and whines, this American catastrophe is on his hands and on his head.
Javier E

Losing Earth: The Decade We Almost Stopped Climate Change - The New York Times - 0 views

  • As Malcolm Forbes Baldwin, the acting chairman of the president’s Council for Environmental Quality, told industry executives in 1981, “There can be no more important or conservative concern than the protection of the globe itself.”
  • Among those who called for urgent, immediate and far-reaching climate policy were Senators John Chafee, Robert Stafford and David Durenberger; the E.P.A. administrator, William K. Reilly; and, during his campaign for president, George H.W. Bush.
  • It was understood that action would have to come immediately. At the start of the 1980s, scientists within the federal government predicted that conclusive evidence of warming would appear on the global temperature record by the end of the decade, at which point it would be too late to avoid disaster.
  • ...180 more annotations...
  • If the world had adopted the proposal widely endorsed at the end of the ’80s — a freezing of carbon emissions, with a reduction of 20 percent by 2005 — warming could have been held to less than 1.5 degrees.
  • Action had to be taken, and the United States would need to lead. It didn’t.
  • There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
  • The first suggestion to Rafe Pomerance that humankind was destroying the conditions necessary for its own survival came on Page 66 of the government publication EPA-600/7-78-019. It was a technical report about coal
  • ‘This Is the Whole Banana’ Spring 1979
  • There was an urgent problem that demanded their attention, MacDonald believed, because human civilization faced an existential crisis. In “How to Wreck the Environment,” a 1968 essay published while he was a science adviser to Lyndon Johnson, MacDonald predicted a near future in which “nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe.” One of the most potentially devastating such weapons, he believed, was the gas that we exhaled with every breath: carbon dioxide. By vastly increasing carbon emissions, the world’s most advanced militaries could alter weather patterns and wreak famine, drought and economic collapse.
  • the Jasons. They were like one of those teams of superheroes with complementary powers that join forces in times of galactic crisis. They had been brought together by federal agencies, including the C.I.A, to devise scientific solutions to national-security problems: how to detect an incoming missile; how to predict fallout from a nuclear bomb; how to develop unconventional weapons, like plague-infested rats.
  • Agle pointed to an article about a prominent geophysicist named Gordon MacDonald, who was conducting a study on climate change with the Jasons, the mysterious coterie of elite scientists to which he belonged
  • During the spring of 1977 and the summer of 1978, the Jasons met to determine what would happen once the concentration of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. It was an arbitrary milestone, the doubling, but a useful one, as its inevitability was not in question; the threshold would most likely be breached by 2035.
  • The Jasons’ report to the Department of Energy, “The Long-Term Impact of Atmospheric Carbon Dioxide on Climate,” was written in an understated tone that only enhanced its nightmarish findings: Global temperatures would increase by an average of two to three degrees Celsius; Dust Bowl conditions would “threaten large areas of North America, Asia and Africa”; access to drinking water and agricultural production would fall, triggering mass migration on an unprecedented scale. “Perhaps the most ominous feature,” however, was the effect of a changing climate on the poles. Even a minimal warming “could lead to rapid melting” of the West Antarctic ice sheet. The ice sheet contained enough water to raise the level of the oceans 16 feet.
  • MacDonald explained that he first studied the carbon-dioxide issue when he was about Pomerance’s age — in 1961, when he served as an adviser to John F. Kennedy. Pomerance pieced together that MacDonald, in his youth, had been something of a prodigy: In his 20s, he advised Dwight D. Eisenhower on space exploration; at 32, he became a member of the National Academy of Sciences; at 40, he was appointed to the inaugural Council on Environmental Quality, where he advised Richard Nixon on the environmental dangers of burning coal. He monitored the carbon-dioxide problem the whole time, with increasing alarm.
  • They were surprised to learn how few senior officials were familiar with the Jasons’ findings, let alone understood the ramifications of global warming. At last, having worked their way up the federal hierarchy, the two went to see the president’s top scientist, Frank Press.
  • Thus began the Gordon and Rafe carbon-dioxide roadshow. Beginning in the spring of 1979, Pomerance arranged informal briefings with the E.P.A., the National Security Council, The New York Times, the Council on Environmental Quality and the Energy Department, which, Pomerance learned, had established an Office of Carbon Dioxide Effects two years earlier at MacDonald’s urging
  • Out of respect for MacDonald, Press had summoned to their meeting what seemed to be the entire senior staff of the president’s Office of Science and Technology Policy — the officials consulted on every critical matter of energy and national security. What Pomerance had expected to be yet another casual briefing assumed the character of a high-level national-security meeting.
  • MacDonald would begin his presentation by going back more than a century to John Tyndall — an Irish physicist who was an early champion of Charles Darwin’s work and died after being accidentally poisoned by his wife. In 1859, Tyndall found that carbon dioxide absorbed heat and that variations in the composition of the atmosphere could create changes in climate. These findings inspired Svante Arrhenius, a Swedish chemist and future Nobel laureate, to deduce in 1896 that the combustion of coal and petroleum could raise global temperatures. This warming would become noticeable in a few centuries, Arrhenius calculated, or sooner if consumption of fossil fuels continued to increase.
  • Four decades later, a British steam engineer named Guy Stewart Callendar discovered that, at the weather stations he observed, the previous five years were the hottest in recorded history. Humankind, he wrote in a paper, had become “able to speed up the processes of Nature.” That was in 1939.
  • MacDonald’s history concluded with Roger Revelle, perhaps the most distinguished of the priestly caste of government scientists who, since the Manhattan Project, advised every president on major policy; he had been a close colleague of MacDonald and Press since they served together under Kennedy. In a 1957 paper written with Hans Suess, Revelle concluded that “human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.” Revelle helped the Weather Bureau establish a continuous measurement of atmospheric carbon dioxide at a site perched near the summit of Mauna Loa on the Big Island of Hawaii, 11,500 feet above the sea — a rare pristine natural laboratory on a planet blanketed by fossil-fuel emissions.
  • After nearly a decade of observation, Revelle had shared his concerns with Lyndon Johnson, who included them in a special message to Congress two weeks after his inauguration. Johnson explained that his generation had “altered the composition of the atmosphere on a global scale” through the burning of fossil fuels, and his administration commissioned a study of the subject by his Science Advisory Committee. Revelle was its chairman, and its 1965 executive report on carbon dioxide warned of the rapid melting of Antarctica, rising seas, increased acidity of fresh waters — changes that would require no less than a coordinated global effort to forestall. Yet emissions continued to rise, and at this rate, MacDonald warned, they could see a snowless New England, the swamping of major coastal cities, as much as a 40 percent decline in national wheat production, the forced migration of about one-quarter of the world’s population. Not within centuries — within their own lifetimes.
  • On May 22, Press wrote a letter to the president of the National Academy of Sciences requesting a full assessment of the carbon-dioxide issue. Jule Charney, the father of modern meteorology, would gather the nation’s top oceanographers, atmospheric scientists and climate modelers to judge whether MacDonald’s alarm was justified — whether the world was, in fact, headed to cataclysm.
  • If Charney’s group confirmed that the world was careering toward an existential crisis, the president would be forced to act.
  • Hansen turned from the moon to Venus. Why, he tried to determine, was its surface so hot? In 1967, a Soviet satellite beamed back the answer: The planet’s atmosphere was mainly carbon dioxide. Though once it may have had habitable temperatures, it was believed to have succumbed to a runaway greenhouse effect: As the sun grew brighter, Venus’s ocean began to evaporate, thickening the atmosphere, which forced yet greater evaporation — a self-perpetuating cycle that finally boiled off the ocean entirely and heated the planet’s surface to more than 800 degrees Fahrenheit.
  • At the other extreme, Mars’s thin atmosphere had insufficient carbon dioxide to trap much heat at all, leaving it about 900 degrees colder. Earth lay in the middle, its Goldilocks greenhouse effect just strong enough to support life.
  • We want to learn more about Earth’s climate, Jim told Anniek — and how humanity can influence it. He would use giant new supercomputers to map the planet’s atmosphere. They would create Mirror Worlds: parallel realities that mimicked our own. These digital simulacra, technically called “general circulation models,” combined the mathematical formulas that governed the behavior of the sea, land and sky into a single computer model. Unlike the real world, they could be sped forward to reveal the future.
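The “Mirror World” idea — a physical model of the planet stepped forward in time — can be illustrated at toy scale. The sketch below is a zero-dimensional energy-balance model, vastly simpler than the general circulation models the passage describes; the parameter values (solar constant, albedo, effective emissivity, mixed-layer heat capacity) are textbook approximations chosen for illustration, not figures from Hansen’s work.

```python
# Toy zero-dimensional energy-balance "mirror world": the planet warms or
# cools until outgoing infrared radiation balances absorbed sunlight.
# A pedagogical sketch only -- not a general circulation model.

S = 1361.0       # solar constant, W/m^2
ALBEDO = 0.3     # fraction of sunlight reflected back to space
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
EPSILON = 0.612  # effective emissivity: a crude stand-in for the greenhouse effect

def equilibrium_temp(epsilon):
    """Surface temperature (K) at which emitted IR balances absorbed sunlight."""
    absorbed = S * (1 - ALBEDO) / 4  # incoming flux averaged over the sphere
    return (absorbed / (epsilon * SIGMA)) ** 0.25

def step_forward(T, epsilon, dt_years, heat_capacity=2.1e8):
    """Advance temperature one explicit-Euler step.

    heat_capacity is J/m^2/K for a ~50 m ocean mixed layer (an assumption).
    """
    seconds = dt_years * 3.15e7
    absorbed = S * (1 - ALBEDO) / 4
    emitted = epsilon * SIGMA * T ** 4
    return T + (absorbed - emitted) * seconds / heat_capacity

# "Speed the mirror world forward": integrate a century, year by year.
T = 288.0  # start near today's global mean surface temperature, K
for _ in range(100):
    T = step_forward(T, EPSILON, dt_years=1)
print(round(T, 1), round(equilibrium_temp(EPSILON), 1))  # both near 288 K
```

Lowering `EPSILON` (more greenhouse trapping) and rerunning shows the equilibrium temperature climb, which is the qualitative behavior the real models quantify.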
  • The government officials, many of them scientists themselves, tried to suppress their awe of the legends in their presence: Henry Stommel, the world’s leading oceanographer; his protégé, Carl Wunsch, a Jason; the Manhattan Project alumnus Cecil Leith; the Harvard planetary physicist Richard Goody. These were the men who, in the last three decades, had discovered foundational principles underlying the relationships among sun, atmosphere, land and ocean — which is to say, the climate.
  • When, at Charney’s request, Hansen programmed his model to consider a future of doubled carbon dioxide, it predicted a temperature increase of four degrees Celsius. That was twice as much warming as the prediction made by the most prominent climate modeler, Syukuro Manabe, whose government lab at Princeton was the first to model the greenhouse effect. The difference between the two predictions — between warming of two degrees Celsius and four degrees Celsius — was the difference between damaged coral reefs and no reefs whatsoever, between thinning forests and forests enveloped by desert, between catastrophe and chaos.
  • The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius.
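The single number — three degrees per doubling — implies a simple back-of-the-envelope formula, because radiative forcing grows with the logarithm of CO₂ concentration. The sketch below uses the standard modern forcing coefficient of 5.35 W/m² per natural log of the concentration ratio (Myhre et al., 1998); that coefficient and the 280 ppm preindustrial baseline are my illustrative assumptions, not figures taken from the Charney report.

```python
import math

# Back-of-the-envelope Charney-sensitivity arithmetic: equilibrium warming
# scales with the logarithm of the CO2 concentration ratio.

FORCING_COEFF = 5.35            # W/m^2 per ln(concentration ratio) -- assumed
SENSITIVITY_PER_DOUBLING = 3.0  # deg C, the Charney report's central estimate
BASELINE_PPM = 280.0            # preindustrial CO2 concentration -- assumed

def warming(co2_ppm):
    """Equilibrium warming (deg C) relative to the preindustrial baseline."""
    forcing = FORCING_COEFF * math.log(co2_ppm / BASELINE_PPM)
    forcing_per_doubling = FORCING_COEFF * math.log(2)
    return SENSITIVITY_PER_DOUBLING * forcing / forcing_per_doubling

print(round(warming(560), 1))  # doubling gives 3.0 deg C by construction
print(round(warming(420), 1))  # ~1.8 deg C at a present-day-like concentration
```

The logarithm is why the report could speak of warming “per doubling” at all: each successive doubling, from any starting point, adds the same increment.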
  • The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
  • After the publication of the Charney report, Exxon decided to create its own dedicated carbon-dioxide research program, with an annual budget of $600,000. Only Exxon was asking a slightly different question than Jule Charney. Exxon didn’t concern itself primarily with how much the world would warm. It wanted to know how much of the warming Exxon could be blamed for.
  • “It behooves us to start a very aggressive defensive program,” Shaw wrote in a memo to a manager, “because there is a good probability that legislation affecting our business will be passed.”
  • Shaw turned to Wallace Broecker, a Columbia University oceanographer who was the second author of Roger Revelle’s 1965 carbon-dioxide report for Lyndon Johnson. In 1977, in a presentation at the American Geophysical Union, Broecker predicted that fossil fuels would have to be restricted, whether by taxation or fiat. More recently, he had testified before Congress, calling carbon dioxide “the No. 1 long-term environmental problem.” If presidents and senators trusted Broecker to tell them the bad news, he was good enough for Exxon.
  • The company had been studying the carbon-dioxide problem for decades, since before it changed its name to Exxon. In 1957, scientists from Humble Oil published a study tracking “the enormous quantity of carbon dioxide” contributed to the atmosphere since the Industrial Revolution “from the combustion of fossil fuels.” Even then, the observation that burning fossil fuels had increased the concentration of carbon in the atmosphere was well understood and accepted by Humble’s scientists.
  • The American Petroleum Institute, the industry’s largest trade association, asked the same question in 1958 through its air-pollution study group and replicated the findings made by Humble Oil. So did another A.P.I. study conducted by the Stanford Research Institute a decade later, in 1968, which concluded that the burning of fossil fuels would bring “significant temperature changes” by the year 2000 and ultimately “serious worldwide environmental changes,” including the melting of the Antarctic ice cap and rising seas.
  • The ritual repeated itself every few years. Industry scientists, at the behest of their corporate bosses, reviewed the problem and found good reasons for alarm and better excuses to do nothing. Why should they act when almost nobody within the United States government — nor, for that matter, within the environmental movement — seemed worried?
  • Why take on an intractable problem that would not be detected until this generation of employees was safely retired? Worse, the solutions seemed more punitive than the problem itself. Historically, energy use had correlated to economic growth — the more fossil fuels we burned, the better our lives became. Why mess with that?
  • That June, Jimmy Carter signed the Energy Security Act of 1980, which directed the National Academy of Sciences to start a multiyear, comprehensive study, to be called “Changing Climate,” that would analyze social and economic effects of climate change. More urgent, the National Commission on Air Quality, at the request of Congress, invited two dozen experts, including Henry Shaw himself, to a meeting in Florida to propose climate policy.
  • On April 3, 1980, Senator Paul Tsongas, a Massachusetts Democrat, held the first congressional hearing on carbon-dioxide buildup in the atmosphere. Gordon MacDonald testified that the United States should “take the initiative” and develop, through the United Nations, a way to coordinate every nation’s energy policies to address the problem.
  • During the expansion of the Clean Air Act, he pushed for the creation of the National Commission on Air Quality, charged with ensuring that the goals of the act were being met. One such goal was a stable global climate. The Charney report had made clear that goal was not being met, and now the commission wanted to hear proposals for legislation. It was a profound responsibility, and the two dozen experts invited to the Pink Palace — policy gurus, deep thinkers, an industry scientist and an environmental activist — had only three days to achieve it, but the utopian setting made everything seem possible.
  • We have less time than we realize, said an M.I.T. nuclear engineer named David Rose, who studied how civilizations responded to large technological crises. “People leave their problems until the 11th hour, the 59th minute,” he said. “And then: ‘Eloi, Eloi, Lama Sabachthani?’ ” — “My God, my God, why hast thou forsaken me?”
  • The attendees seemed to share a sincere interest in finding solutions. They agreed that some kind of international treaty would ultimately be needed to keep atmospheric carbon dioxide at a safe level. But nobody could agree on what that level was.
  • William Elliott, a NOAA scientist, introduced some hard facts: If the United States stopped burning carbon that year, it would delay the arrival of the doubling threshold by only five years. If Western nations somehow managed to stabilize emissions, it would forestall the inevitable by only eight years. The only way to avoid the worst was to stop burning coal. Yet China, the Soviet Union and the United States, by far the world’s three largest coal producers, were frantically accelerating extraction.
  • “Do we have a problem?” asked Anthony Scoville, a congressional science consultant. “We do, but it is not the atmospheric problem. It is the political problem.” He doubted that any scientific report, no matter how ominous its predictions, would persuade politicians to act.
  • The talk of ending oil production stirred for the first time the gentleman from Exxon. “I think there is a transition period,” Henry Shaw said. “We are not going to stop burning fossil fuels and start looking toward solar or nuclear fusion and so on. We are going to have a very orderly transition from fossil fuels to renewable energy sources.”
  • What if the problem was that they were thinking of it as a problem? “What I am saying,” Scoville continued, “is that in a sense we are making a transition not only in energy but the economy as a whole.” Even if the coal and oil industries collapsed, renewable technologies like solar energy would take their place. Jimmy Carter was planning to invest $80 billion in synthetic fuel. “My God,” Scoville said, “with $80 billion, you could have a photovoltaics industry going that would obviate the need for synfuels forever!”
  • nobody could agree what to do. John Perry, a meteorologist who had worked as a staff member on the Charney report, suggested that American energy policy merely “take into account” the risks of global warming, though he acknowledged that a nonbinding measure might seem “intolerably stodgy.” “It is so weak,” Pomerance said, the air seeping out of him, “as to not get us anywhere.”
  • Scoville pointed out that the United States was responsible for the largest share of global carbon emissions. But not for long. “If we’re going to exercise leadership,” he said, “the opportunity is now.”
  • One way to lead, he proposed, would be to classify carbon dioxide as a pollutant under the Clean Air Act and regulate it as such. This was received by the room like a belch. By Scoville’s logic, every sigh was an act of pollution. Did the science really support such an extreme measure? The Charney report did exactly that, Pomerance said.
  • Slade, the director of the Energy Department’s carbon-dioxide program, considered the lag a saving grace. If changes did not occur for a decade or more, he said, those in the room couldn’t be blamed for failing to prevent them. So what was the problem?
  • “Call it whatever.” Besides, Pomerance added, they didn’t have to ban coal tomorrow. A pair of modest steps could be taken immediately to show the world that the United States was serious: the implementation of a carbon tax and increased investment in renewable energy. Then the United States could organize an international summit meeting to address climate change.
  • these two dozen experts, who agreed on the major points and had made a commitment to Congress, could not draft a single paragraph. Hours passed in a hell of fruitless negotiation, self-defeating proposals and impulsive speechifying. Pomerance and Scoville pushed to include a statement calling for the United States to “sharply accelerate international dialogue,” but they were sunk by objections and caveats.
  • They never got to policy proposals. They never got to the second paragraph. The final statement was signed by only the moderator, who phrased it more weakly than the declaration calling for the workshop in the first place. “The guide I would suggest,” Jorling wrote, “is whether we know enough not to recommend changes in existing policy.”
  • Pomerance had seen enough. A consensus-based strategy would not work — could not work — without American leadership. And the United States wouldn’t act unless a strong leader persuaded it to do so — someone who would speak with authority about the science, demand action from those in power and risk everything in pursuit of justice.
  • The meeting ended Friday morning. On Tuesday, four days later, Ronald Reagan was elected president.
  • ‘Otherwise, They’ll Gurgle’ November 1980-September 1981
  • In the midst of this carnage, the Council on Environmental Quality submitted a report to the White House warning that fossil fuels could “permanently and disastrously” alter Earth’s atmosphere, leading to “a warming of the Earth, possibly with very serious effects.” Reagan did not act on the council’s advice. Instead, his administration considered eliminating the council.
  • After the election, Reagan considered plans to close the Energy Department, increase coal production on federal land and deregulate surface coal mining. Once in office, he appointed James Watt, the president of a legal firm that fought to open public lands to mining and drilling, to run the Interior Department. “We’re deliriously happy,” the president of the National Coal Association was reported to have said. Reagan preserved the E.P.A. but named as its administrator Anne Gorsuch, an anti-regulation zealot who proceeded to cut the agency’s staff and budget by about a quarter.
  • (Reagan “has declared open war on solar energy,” the director of the nation’s lead solar-energy research agency said, after he was asked to resign.) Reagan appeared determined to reverse the environmental achievements of Jimmy Carter, before undoing those of Richard Nixon, Lyndon Johnson, John F. Kennedy and, if he could get away with it, Theodore Roosevelt.
  • When Reagan considered closing the Council on Environmental Quality, its acting chairman, Malcolm Forbes Baldwin, wrote to the vice president and the White House chief of staff begging them to reconsider; in a major speech the same week, “A Conservative’s Program for the Environment,” Baldwin argued that it was “time for today’s conservatives explicitly to embrace environmentalism.” Environmental protection was not only good sense. It was good business. What could be more conservative than an efficient use of resources that led to fewer federal subsidies?
  • Meanwhile the Charney report continued to vibrate at the periphery of public consciousness. Its conclusions were confirmed by major studies from the Aspen Institute, the International Institute for Applied Systems Analysis near Vienna and the American Association for the Advancement of Science. Every month or so, nationally syndicated articles appeared summoning apocalypse: “Another Warning on ‘Greenhouse Effect,’ ” “Global Warming Trend ‘Beyond Human Experience,’ ” “Warming Trend Could ‘Pit Nation Against Nation.’ ”
  • Pomerance read on the front page of The New York Times on Aug. 22, 1981, about a forthcoming paper in Science by a team of seven NASA scientists. They had found that the world had already warmed in the past century. Temperatures hadn’t increased beyond the range of historical averages, but the scientists predicted that the warming signal would emerge from the noise of routine weather fluctuations much sooner than previously expected. Most unusual of all, the paper ended with a policy recommendation: In the coming decades, the authors wrote, humankind should develop alternative sources of energy and use fossil fuels only “as necessary.” The lead author was James Hansen.
  • Pomerance listened and watched. He understood Hansen’s basic findings well enough: Earth had been warming since 1880, and the warming would reach “almost unprecedented magnitude” in the next century, leading to the familiar suite of terrors, including the flooding of a 10th of New Jersey and a quarter of Louisiana and Florida. But Pomerance was excited to find that Hansen could translate the complexities of atmospheric science into plain English.
  • 7. ‘We’re All Going to Be the Victims’ March 1982
  • Gore had learned about climate change a dozen years earlier as an undergraduate at Harvard, when he took a class taught by Roger Revelle. Humankind was on the brink of radically transforming the global atmosphere, Revelle explained, drawing Keeling’s rising zigzag on the blackboard, and risked bringing about the collapse of civilization. Gore was stunned: Why wasn’t anyone talking about this?
  • Most in Congress considered the science committee a legislative backwater, if they considered it at all; this made Gore’s subcommittee, which had no legislative authority, an afterthought to an afterthought. That, Gore vowed, would change. Environmental and health stories had all the elements of narrative drama: villains, victims and heroes. In a hearing, you could summon all three, with the chairman serving as narrator, chorus and moral authority. He told his staff director that he wanted to hold a hearing every week.
  • The Revelle hearing went as Grumbly had predicted. The urgency of the issue was lost on Gore’s older colleagues, who drifted in and out while the witnesses testified. There were few people left by the time the Brookings Institution economist Lester Lave warned that humankind’s profligate exploitation of fossil fuels posed an existential test to human nature. “Carbon dioxide stands as a symbol now of our willingness to confront the future,” he said. “It will be a sad day when we decide that we just don’t have the time or thoughtfulness to address those issues.”
  • That night, the news programs featured the resolution of the baseball strike, the ongoing budgetary debate and the national surplus of butter.
  • There emerged, despite the general comity, a partisan divide. Unlike the Democrats, the Republicans demanded action. “Today I have a sense of déjà vu,” said Robert Walker, a Republican from Pennsylvania. In each of the last five years, he said, “we have been told and told and told that there is a problem with the increasing carbon dioxide in the atmosphere. We all accept that fact, and we realize that the potential consequences are certainly major in their impact on mankind.” Yet they had failed to propose a single law. “Now is the time,” he said. “The research is clear. It is up to us now to summon the political will.”
  • Hansen flew to Washington to testify on March 25, 1982, performing before a gallery even more thinly populated than at Gore’s first hearing on the greenhouse effect. Gore began by attacking the Reagan administration for cutting funding for carbon-dioxide research despite the “broad consensus in the scientific community that the greenhouse effect is a reality.” William Carney, a Republican from New York, bemoaned the burning of fossil fuels and argued passionately that science should serve as the basis for legislative policy.
  • the experts invited by Gore agreed with the Republicans: The science was certain enough. Melvin Calvin, a Berkeley chemist who won the Nobel Prize for his work on the carbon cycle, said that it was useless to wait for stronger evidence of warming. “You cannot do a thing about it when the signals are so big that they come out of the noise,” he said. “You have to look for early warning signs.”
  • Hansen’s job was to share the warning signs, to translate the data into plain English. He explained a few discoveries that his team had made — not with computer models but in libraries. By analyzing records from hundreds of weather stations, he found that the surface temperature of the planet had already increased four-tenths of a degree Celsius in the previous century. Data from several hundred tide-gauge stations showed that the oceans had risen four inches since the 1880s.
  • It occurred to Hansen that this was the only political question that mattered: How long until the worst began? It was not a question on which geophysicists expended much effort; the difference between five years and 50 years in the future was meaningless in geologic time. Politicians were capable of thinking only in terms of electoral time: six years, four years, two years. But when it came to the carbon problem, the two time schemes were converging.
  • “Within 10 or 20 years,” Hansen said, “we will see climate changes which are clearly larger than the natural variability.” James Scheuer wanted to make sure he understood this correctly. No one else had predicted that the signal would emerge that quickly. “If it were one or two degrees per century,” he said, “that would be within the range of human adaptability. But we are pushing beyond the range of human adaptability.” “Yes,” Hansen said.
  • How soon, Scheuer asked, would they have to change the national model of energy production? Hansen hesitated — it wasn’t a scientific question. But he couldn’t help himself. He had been irritated, during the hearing, by all the ludicrous talk about the possibility of growing more trees to offset emissions. False hopes were worse than no hope at all: They undermined the prospect of developing real solutions. “That time is very soon,” Hansen said finally. “My opinion is that it is past,” Calvin said, but he was not heard because he spoke from his seat. He was told to speak into the microphone. “It is already later,” Calvin said, “than you think.”
  • From Gore’s perspective, the hearing was an unequivocal success. That night Dan Rather devoted three minutes of “CBS Evening News” to the greenhouse effect. A correspondent explained that temperatures had increased over the previous century, great sheets of pack ice in Antarctica were rapidly melting, the seas were rising; Calvin said that “the trend is all in the direction of an impending catastrophe”; and Gore mocked Reagan for his shortsightedness. Later, Gore could take credit for protecting the Energy Department’s carbon-dioxide program, which in the end was largely preserved.
  • 8. ‘The Direction of an Impending Catastrophe’ 1982
  • Following Henry Shaw’s recommendation to establish credibility ahead of any future legislative battles, Exxon had begun to spend conspicuously on global-warming research. It donated tens of thousands of dollars to some of the most prominent research efforts, including one at Woods Hole led by the ecologist George Woodwell, who had been calling for major climate policy as early as the mid-1970s, and an international effort coordinated by the United Nations. Now Shaw offered to fund the October 1982 symposium on climate change at Columbia’s Lamont-Doherty campus.
  • David boasted that Exxon would usher in a new global energy system to save the planet from the ravages of climate change. He went so far as to argue that capitalism’s blind faith in the wisdom of the free market was “less than satisfying” when it came to the greenhouse effect. Ethical considerations were necessary, too. He pledged that Exxon would revise its corporate strategy to account for climate change, even if it were not “fashionable” to do so. As Exxon had already made heavy investments in nuclear and solar technology, he was “generally upbeat” that Exxon would “invent” a future of renewable energy.
  • Hansen had reason to feel upbeat himself. If the world’s largest oil-and-gas company supported a new national energy model, the White House would not stand in its way. The Reagan administration was hostile to change from within its ranks. But it couldn’t be hostile to Exxon.
  • The carbon-dioxide issue was beginning to receive major national attention — Hansen’s own findings had become front-page news, after all. What started as a scientific story was turning into a political story.
  • The political realm was itself a kind of Mirror World, a parallel reality that crudely mimicked our own. It shared many of our most fundamental laws, like the laws of gravity and inertia and publicity. And if you applied enough pressure, the Mirror World of politics could be sped forward to reveal a new future. Hansen was beginning to understand that too.
  • 1. ‘Caution, Not Panic’ 1983-1984
  • in the fall of 1983, the climate issue entered an especially long, dark winter. And all because of a single report that had done nothing to change the state of climate science but transformed the state of climate politics.
  • After the publication of the Charney report in 1979, Jimmy Carter had directed the National Academy of Sciences to prepare a comprehensive, $1 million analysis of the carbon-dioxide problem: a Warren Commission for the greenhouse effect. A team of scientist-dignitaries — among them Revelle, the Princeton modeler Syukuro Manabe and the Harvard political economist Thomas Schelling, one of the intellectual architects of Cold War game theory — would review the literature, evaluate the consequences of global warming for the world order and propose remedies.
  • Then Reagan won the White House.
  • the incipient report served as the Reagan administration’s answer to every question on the subject. There could be no climate policy, Fred Koomanoff and his associates said, until the academy ruled. In the Mirror World of the Reagan administration, the warming problem hadn’t been abandoned at all. A careful, comprehensive solution was being devised. Everyone just had to wait for the academy’s elders to explain what it was.
  • The committee’s chairman, William Nierenberg — a Jason, presidential adviser and director of Scripps, the nation’s pre-eminent oceanographic institution — argued that action had to be taken immediately, before all the details could be known with certainty, or else it would be too late.
  • Better to bet on American ingenuity to save the day. Major interventions in national energy policy, taken immediately, might end up being more expensive, and less effective, than actions taken decades in the future, after more was understood about the economic and social consequences of a warmer planet. Yes, the climate would change, mostly for the worst, but future generations would be better equipped to change with it.
  • Government officials who knew Nierenberg were not surprised by his conclusions: He was an optimist by training and experience, a devout believer in the doctrine of American exceptionalism, one of the elite class of scientists who had helped the nation win a global war, invent the most deadly weapon conceivable and create the booming aerospace and computer industries. America had solved every existential problem it had confronted over the previous generation; it would not be daunted by an excess of carbon dioxide. Nierenberg had also served on Reagan’s transition team. Nobody believed that he had been directly influenced by his political connections, but his views — optimistic about the saving graces of market forces, pessimistic about the value of government regulation — reflected all the ardor of his party.
  • That’s what Nierenberg wrote in “Changing Climate.” But it’s not what he said in the press interviews that followed. He argued the opposite: There was no urgent need for action. The public should not entertain the most “extreme negative speculations” about climate change (despite the fact that many of those speculations appeared in his report). Though “Changing Climate” urged an accelerated transition to renewable fuels, noting that it would take thousands of years for the atmosphere to recover from the damage of the last century, Nierenberg recommended “caution, not panic.” Better to wait and see.
  • The damage of “Changing Climate” was squared by the amount of attention it received. Nierenberg’s speech in the Great Hall, being one-500th the length of the actual assessment, received 500 times the press coverage. As The Wall Street Journal put it, in a line echoed by trade journals across the nation: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: You can cope.”
  • On “CBS Evening News,” Dan Rather said the academy had given “a cold shoulder” to a grim, 200-page E.P.A. assessment published earlier that week (titled “Can We Delay a Greenhouse Warming?”; the E.P.A.’s answer, reduced to a word, was no). The Washington Post described the two reports, taken together, as “clarion calls to inaction.”
  • George Keyworth II, Reagan’s science adviser. Keyworth used Nierenberg’s optimism as reason to discount the E.P.A.’s “unwarranted and unnecessarily alarmist” report and warned against taking any “near-term corrective action” on global warming. Just in case it wasn’t clear, Keyworth added, “there are no actions recommended other than continued research.”
  • Edward David Jr., two years removed from boasting of Exxon’s commitment to transforming global energy policy, told Science that the corporation had reconsidered. “Exxon has reverted to being mainly a supplier of conventional hydrocarbon fuels — petroleum products, natural gas and steam coal,” David said. The American Petroleum Institute canceled its own carbon-dioxide research program, too.
  • Exxon soon revised its position on climate-change research. In a presentation at an industry conference, Henry Shaw cited “Changing Climate” as evidence that “the general consensus is that society has sufficient time to technologically adapt to a CO₂ greenhouse effect.” If the academy had concluded that regulations were not a serious option, why should Exxon protest?
  • 2. ‘You Scientists Win’ 1985
  • 3. The Size of The Human Imagination Spring-Summer 1986
  • Curtis Moore’s proposal: Use ozone to revive climate. The ozone hole had a solution — an international treaty, already in negotiation. Why not hitch the milk wagon to the bullet train? Pomerance was skeptical. The problems were related, sure: Without a reduction in CFC emissions, you didn’t have a chance of averting cataclysmic global warming. But it had been difficult enough to explain the carbon issue to politicians and journalists; why complicate the sales pitch? Then again, he didn’t see what choice he had. The Republicans controlled the Senate, and Moore was his connection to the Senate’s environmental committee.
  • Pomerance met with Senator John Chafee, a Republican from Rhode Island, and helped persuade him to hold a double-barreled hearing on the twin problems of ozone and carbon dioxide on June 10 and 11, 1986.
  • F. Sherwood Rowland, Robert Watson, a NASA scientist, and Richard Benedick, the administration’s lead representative in international ozone negotiations, would discuss ozone; James Hansen, Al Gore, the ecologist George Woodwell and Carl Wunsch, a veteran of the Charney group, would testify about climate change.
  • As Pomerance had hoped, fear about the ozone layer ensured a bounty of press coverage for the climate-change testimony. But as he had feared, it caused many people to conflate the two crises. One was Peter Jennings, who aired the video on ABC’s “World News Tonight,” warning that the ozone hole “could lead to flooding all over the world, also to drought and to famine.”
  • The confusion helped: For the first time since the “Changing Climate” report, global-warming headlines appeared by the dozen. William Nierenberg’s “caution, not panic” line was inverted. It was all panic without a hint of caution: “A Dire Forecast for ‘Greenhouse’ Earth” (the front page of The Washington Post); “Scientists Predict Catastrophes in Growing Global Heat Wave” (Chicago Tribune); “Swifter Warming of Globe Foreseen” (The New York Times).
  • After three years of backsliding and silence, Pomerance was exhilarated to see interest in the issue spike overnight. Not only that: A solution materialized, and a moral argument was passionately articulated — by Rhode Island’s Republican senator no less. “Ozone depletion and the greenhouse effect can no longer be treated solely as important scientific questions,” Chafee said. “They must be seen as critical problems facing the nations of the world, and they are problems that demand solutions.”
  • The old canard about the need for more research was roundly mocked — by Woodwell, by a W.R.I. colleague named Andrew Maguire, by Senator George Mitchell, a Democrat from Maine. “Scientists are never 100 percent certain,” the Princeton historian Theodore Rabb testified. “That notion of total certainty is something too elusive ever to be sought.” As Pomerance had been saying since 1979, it was past time to act. Only now the argument was so broadly accepted that nobody dared object.
  • The ozone hole, Pomerance realized, had moved the public because, though it was no more visible than global warming, people could be made to see it. They could watch it grow on video. Its metaphors were emotionally wrought: Instead of summoning a glass building that sheltered plants from chilly weather (“Everything seems to flourish in there”), the hole evoked a violent rending of the firmament, inviting deathly radiation. Americans felt that their lives were in danger. An abstract, atmospheric problem had been reduced to the size of the human imagination. It had been made just small enough, and just large enough, to break through.
  • Four years after “Changing Climate,” two years after a hole had torn open the firmament and a month after the United States and more than three dozen other nations signed a treaty to limit use of CFCs, the climate-change corps was ready to celebrate. It had become conventional wisdom that climate change would follow ozone’s trajectory. Reagan’s E.P.A. administrator, Lee M. Thomas, said as much the day he signed the Montreal Protocol on Substances That Deplete the Ozone Layer (the successor to the Vienna Convention), telling reporters that global warming was likely to be the subject of a future international agreement.
  • Congress had already begun to consider policy — in 1987 alone, there were eight days of climate hearings, in three committees, across both chambers of Congress; Senator Joe Biden, a Delaware Democrat, had introduced legislation to establish a national climate-change strategy. And so it was that Jim Hansen found himself on Oct. 27 in the not especially distinguished ballroom of the Quality Inn on New Jersey Avenue, a block from the Capitol, at “Preparing for Climate Change,” which was technically a conference but felt more like a wedding.
  • John Topping was an old-line Rockefeller Republican, a Commerce Department lawyer under Nixon and an E.P.A. official under Reagan. He first heard about the climate problem in the halls of the E.P.A. in 1982 and sought out Hansen, who gave him a personal tutorial. Topping was amazed to discover that out of the E.P.A.’s 13,000-person staff, only seven people, by his count, were assigned to work on climate, though he figured it was more important to the long-term security of the nation than every other environmental issue combined.
  • Glancing around the room, Jim Hansen could chart, like an arborist counting rings on a stump, the growth of the climate issue over the decade. Veterans like Gordon MacDonald, George Woodwell and the environmental biologist Stephen Schneider stood at the center of things. Former and current staff members from the congressional science committees (Tom Grumbly, Curtis Moore, Anthony Scoville) made introductions to the congressmen they advised. Hansen’s owlish nemesis Fred Koomanoff was present, as were his counterparts from the Soviet Union and Western Europe. Rafe Pomerance’s cranium could be seen above the crowd, but unusually he was surrounded by colleagues from other environmental organizations that until now had shown little interest in a diffuse problem with no proven fund-raising record. The party’s most conspicuous newcomers, however, the outermost ring, were the oil-and-gas executives.
  • That evening, as a storm spat and coughed outside, Rafe Pomerance gave one of his exhortative speeches urging cooperation among the various factions, and John Chafee and Roger Revelle received awards; introductions were made and business cards earnestly exchanged. Not even a presentation by Hansen of his research could sour the mood. The next night, on Oct. 28, at a high-spirited dinner party in Topping’s townhouse on Capitol Hill, the oil-and-gas men joked with the environmentalists, the trade-group representatives chatted up the regulators and the academics got merrily drunk. Mikhail Budyko, the don of the Soviet climatologists, settled into an extended conversation about global warming with Topping’s 10-year-old son. It all seemed like the start of a grand bargain, a uniting of factions — a solution.
  • Hansen was accustomed to the bureaucratic nuisances that attended testifying before Congress; before a hearing, he had to send his formal statement to NASA headquarters, which forwarded it to the White House’s Office of Management and Budget for approval. “Major greenhouse climate changes are a certainty,” he had written. “By the 2010s [in every scenario], essentially the entire globe has very substantial warming.”
  • By all appearances, plans for major policy continued to advance rapidly. After the Johnston hearing, Timothy Wirth, a freshman Democratic senator from Colorado on the energy committee, began to plan a comprehensive package of climate-change legislation — a New Deal for global warming. Wirth asked a legislative assistant, David Harwood, to consult with experts on the issue, beginning with Rafe Pomerance, in the hope of converting the science of climate change into a new national energy policy.
  • In March 1988, Wirth joined 41 other senators, nearly half of them Republicans, to demand that Reagan call for an international treaty modeled after the ozone agreement. Because the United States and the Soviet Union were the world’s two largest contributors of carbon emissions, responsible for about one-third of the world total, they should lead the negotiations. Reagan agreed. In May, he signed a joint statement with Mikhail Gorbachev that included a pledge to cooperate on global warming.
  • Al Gore himself had, for the moment, withdrawn his political claim to the issue. In 1987, at the age of 39, Gore announced that he was running for president, in part to bring attention to global warming, but he stopped emphasizing it after the subject failed to captivate New Hampshire primary voters.
  • 5. ‘You Will See Things That You Shall Believe’ Summer 1988
  • It was the hottest and driest summer in history. Everywhere you looked, something was bursting into flames. Two million acres in Alaska incinerated, and dozens of major fires scored the West. Yellowstone National Park lost nearly one million acres. Smoke was visible from Chicago, 1,600 miles away.
  • In Nebraska, suffering its worst drought since the Dust Bowl, there were days when every weather station registered temperatures above 100 degrees. The director of the Kansas Department of Health and Environment warned that the drought might be the dawning of a climatic change that within a half century could turn the state into a desert.
  • On June 22 in Washington, where it hit 100 degrees, Rafe Pomerance received a call from Jim Hansen, who was scheduled to testify the following morning at a Senate hearing called by Timothy Wirth. “I hope we have good media coverage tomorrow,” Hansen said.
  • Hansen had just received the most recent global temperature data. Just over halfway into the year, 1988 was setting records. Already it had nearly clinched the hottest year in history. Ahead of schedule, the signal was emerging from the noise. “I’m going to make a pretty strong statement,” Hansen said.
  • Hansen returned to his testimony. He wrote: “The global warming is now large enough that we can ascribe with a high degree of confidence a cause-and-effect relationship to the greenhouse effect.” He wrote: “1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.” He wrote: “The greenhouse effect has been detected, and it is changing our climate now.”
  • “We have only one planet,” Senator Bennett Johnston intoned. “If we screw it up, we have no place to go.” Senator Max Baucus, a Democrat from Montana, called for the United Nations Environment Program to begin preparing a global remedy to the carbon-dioxide problem. Senator Dale Bumpers, a Democrat of Arkansas, previewed Hansen’s testimony, saying that it “ought to be cause for headlines in every newspaper in America tomorrow morning.” The coverage, Bumpers emphasized, was a necessary precursor to policy. “Nobody wants to take on any of the industries that produce the things that we throw up into the atmosphere,” he said. “But what you have are all these competing interests pitted against our very survival.”
  • Hansen, wiping his brow, spoke without affect, his eyes rarely rising from his notes. The warming trend could be detected “with 99 percent confidence,” he said. “It is changing our climate now.” But he saved his strongest comment for after the hearing, when he was encircled in the hallway by reporters. “It is time to stop waffling so much,” he said, “and say that the evidence is pretty strong that the greenhouse effect is here.”
  • The press followed Bumpers’s advice. Hansen’s testimony prompted headlines in dozens of newspapers across the country, including The New York Times, which announced, across the top of its front page: “Global Warming Has Begun, Expert Tells Senate.”
  • Rafe Pomerance called his allies on Capitol Hill, the young staff members who advised politicians, organized hearings, wrote legislation. We need to finalize a number, he told them, a specific target, in order to move the issue — to turn all this publicity into policy. The Montreal Protocol had called for a 50 percent reduction in CFC emissions by 1998. What was the right target for carbon emissions? It wasn’t enough to exhort nations to do better. That kind of talk might sound noble, but it didn’t change investments or laws. They needed a hard goal — something ambitious but reasonable. And they needed it soon: Just four days after Hansen’s star turn, politicians from 46 nations and more than 300 scientists would convene in Toronto at the World Conference on the Changing Atmosphere, an event described by Philip Shabecoff of The New York Times as “Woodstock for climate change.”
  • Pomerance had a proposal: a 20 percent reduction in carbon emissions by 2000. Ambitious, Harwood said. In all his work planning climate policy, he had seen no assurance that such a steep drop in emissions was possible. Then again, 2000 was more than a decade off, so it allowed for some flexibility.
  • Mintzer pointed out that a 20 percent reduction was consistent with the academic literature on energy efficiency. Various studies over the years had shown that you could improve efficiency in most energy systems by roughly 20 percent if you adopted best practices.
  • Of course, with any target, you had to take into account the fact that the developing world would inevitably consume much larger quantities of fossil fuels by 2000. But those gains could be offset by a wider propagation of the renewable technologies already at hand — solar, wind, geothermal. It was not a rigorous scientific analysis, Mintzer granted, but 20 percent sounded plausible. We wouldn’t need to solve cold fusion or ask Congress to repeal the law of gravity. We could manage it with the knowledge and technology we already had.
  • Besides, Pomerance said, 20 by 2000 sounds good.
  • The conference’s final statement, signed by all 400 scientists and politicians in attendance, repeated the demand with a slight variation: a 20 percent reduction in carbon emissions by 2005. Just like that, Pomerance’s best guess became global diplomatic policy.
  • Hansen, emerging from Anniek’s successful cancer surgery, took it upon himself to start a one-man public information campaign. He gave news conferences and was quoted in seemingly every article about the issue; he even appeared on television with homemade props. Like an entrant at an elementary-school science fair, he made “loaded dice” out of sections of cardboard and colored paper to illustrate the increased likelihood of hotter weather in a warmer climate. Public awareness of the greenhouse effect reached a new high of 68 percent.
  • Global warming became a major subject of the presidential campaign. While Michael Dukakis proposed tax incentives to encourage domestic oil production and boasted that coal could satisfy the nation’s energy needs for the next three centuries, George Bush took advantage. “I am an environmentalist,” he declared on the shore of Lake Erie, the first stop on a five-state environmental tour that would take him to Boston Harbor, Dukakis’s home turf. “Those who think we are powerless to do anything about the greenhouse effect,” he said, “are forgetting about the White House effect.”
  • His running mate emphasized the ticket’s commitment to the issue at the vice-presidential debate. “The greenhouse effect is an important environmental issue,” Dan Quayle said. “We need to get on with it. And in a George Bush administration, you can bet that we will.”
  • This kind of talk roused the oil-and-gas men. “A lot of people on the Hill see the greenhouse effect as the issue of the 1990s,” a gas lobbyist told Oil & Gas Journal. Before a meeting of oil executives shortly after the “environmentalist” candidate won the election, Representative Dick Cheney, a Wyoming Republican, warned, “It’s going to be very difficult to fend off some kind of gasoline tax.” The coal industry, which had the most to lose from restrictions on carbon emissions, had moved beyond denial to resignation. A spokesman for the National Coal Association acknowledged that the greenhouse effect was no longer “an emerging issue. It is here already, and we’ll be hearing more and more about it.”
  • By the end of the year, 32 climate bills had been introduced in Congress, led by Wirth’s omnibus National Energy Policy Act of 1988. Co-sponsored by 13 Democrats and five Republicans, it established as a national goal an “International Global Agreement on the Atmosphere by 1992,” ordered the Energy Department to submit to Congress a plan to reduce energy use by at least 2 percent a year through 2005 and directed the Congressional Budget Office to calculate the feasibility of a carbon tax. A lawyer for the Senate energy committee told an industry journal that lawmakers were “frightened” by the issue and predicted that Congress would eventually pass significant legislation after Bush took office.
  • The other great powers refused to wait. The German Parliament created a special commission on climate change, which concluded that action had to be taken immediately, “irrespective of any need for further research,” and that the Toronto goal was inadequate; it recommended a 30 percent reduction of carbon emissions.
  • Margaret Thatcher, who had studied chemistry at Oxford, warned in a speech to the Royal Society that global warming could “greatly exceed the capacity of our natural habitat to cope” and that “the health of the economy and the health of our environment are totally dependent upon each other.”
  • The prime ministers of Canada and Norway called for a binding international treaty on the atmosphere; Sweden’s Parliament went further, announcing a national strategy to stabilize emissions at the 1988 level and eventually imposing a carbon tax.
  • the United Nations unanimously endorsed the establishment, by the World Meteorological Organization and the United Nations Environment Program, of an Intergovernmental Panel on Climate Change, composed of scientists and policymakers, to conduct scientific assessments and develop global climate policy.
  • One of the I.P.C.C.’s first sessions to plan an international treaty was hosted by the State Department, 10 days after Bush’s inauguration. James Baker chose the occasion to make his first speech as secretary of state. “We can probably not afford to wait until all of the uncertainties about global climate change have been resolved,” he said. “Time will not make the problem go away.”
  • On April 14, 1989, a bipartisan group of 24 senators, led by the majority leader, George Mitchell, requested that Bush cut emissions in the United States even before the I.P.C.C.’s working group made its recommendation. “We cannot afford the long lead times associated with a comprehensive global agreement,” the senators wrote. Bush had promised to combat the greenhouse effect with the White House effect. The self-proclaimed environmentalist was now seated in the Oval Office. It was time.
  • 8. ‘You Never Beat The White House’ April 1989
  • After Jim Baker gave his boisterous address to the I.P.C.C. working group at the State Department, he received a visit from John Sununu, Bush’s chief of staff. Leave the science to the scientists, Sununu told Baker. Stay clear of this greenhouse-effect nonsense. You don’t know what you’re talking about. Baker, who had served as Reagan’s chief of staff, didn’t speak about the subject again.
  • Despite his reputation as a political wolf, Sununu still thought of himself as a scientist — an “old engineer,” as he was fond of putting it, having earned a Ph.D. in mechanical engineering from M.I.T. decades earlier. He lacked the reflexive deference that so many of his political generation reserved for the class of elite government scientists.
  • Since World War II, he believed, conspiratorial forces had used the imprimatur of scientific knowledge to advance an “anti-growth” doctrine. He reserved particular disdain for Paul Ehrlich’s “The Population Bomb,” which prophesied that hundreds of millions of people would starve to death if the world took no step to curb population growth; the Club of Rome, an organization of European scientists, heads of state and economists, which similarly warned that the world would run out of natural resources; and as recently as the mid-’70s, the hypothesis advanced by some of the nation’s most celebrated scientists — including Carl Sagan, Stephen Schneider and Ichtiaque Rasool — that a new ice age was dawning, thanks to the proliferation of man-made aerosols. All were theories of questionable scientific merit, portending vast, authoritarian remedies to halt economic progress.
  • Sununu had suspected that the greenhouse effect belonged to this nefarious cabal since 1975, when the anthropologist Margaret Mead convened a symposium on the subject at the National Institute of Environmental Health Sciences.
  • When Mead talked about “far-reaching” decisions and “long-term consequences,” Sununu heard the marching of jackboots.
  • While Sununu and Darman reviewed Hansen’s statements, the E.P.A. administrator, William K. Reilly, took a new proposal to the White House. The next meeting of the I.P.C.C.’s working group was scheduled for Geneva the following month, in May; it was the perfect occasion, Reilly argued, to take a stronger stand on climate change. Bush should demand a global treaty to reduce carbon emissions.
  • Sununu wouldn’t budge. He ordered the American delegates not to make any commitment in Geneva. Very soon after that, someone leaked the exchange to the press.
  • A deputy of Jim Baker pulled Reilly aside. He said he had a message from Baker, who had observed Reilly’s infighting with Sununu. “In the long run,” the deputy warned Reilly, “you never beat the White House.”
  • 9. ‘A Form of Science Fraud’ May 1989
  • The cameras followed Hansen and Gore into the marbled hallway. Hansen insisted that he wanted to focus on the science. Gore focused on the politics. “I think they’re scared of the truth,” he said. “They’re scared that Hansen and the other scientists are right and that some dramatic policy changes are going to be needed, and they don’t want to face up to it.”
  • The censorship did more to publicize Hansen’s testimony and the dangers of global warming than anything he could have possibly said. At the White House briefing later that morning, Press Secretary Marlin Fitzwater admitted that Hansen’s statement had been changed. He blamed an official “five levels down from the top” and promised that there would be no retaliation. Hansen, he added, was “an outstanding and distinguished scientist” and was “doing a great job.”
  • 10. The White House Effect Fall 1989
  • The Los Angeles Times called the censorship “an outrageous assault.” The Chicago Tribune said it was the beginning of “a cold war on global warming,” and The New York Times warned that the White House’s “heavy-handed intervention sends the signal that Washington wants to go slow on addressing the greenhouse problem.”
  • Darman went to see Sununu. He didn’t like being accused of censoring scientists. They needed to issue some kind of response. Sununu called Reilly to ask if he had any ideas. We could start, Reilly said, by recommitting to a global climate treaty. The United States was the only Western nation on record as opposing negotiations.
  • Sununu sent a telegram to Geneva endorsing a plan “to develop full international consensus on necessary steps to prepare for a formal treaty-negotiating process. The scope and importance of this issue are so great that it is essential for the U.S. to exercise leadership.”
  • Sununu seethed at any mention of the subject. He had taken it upon himself to study more deeply the greenhouse effect; he would have a rudimentary, one-dimensional general circulation model installed on his personal desktop computer. He decided that the models promoted by Jim Hansen were a lot of bunk. They were horribly imprecise in scale and underestimated the ocean’s ability to mitigate warming. Sununu complained about Hansen to D. Allan Bromley, a nuclear physicist from Yale who, at Sununu’s recommendation, was named Bush’s science adviser. Hansen’s findings were “technical poppycock” that didn’t begin to justify such wild-eyed pronouncements that “the greenhouse effect is here” or that the 1988 heat waves could be attributed to global warming, let alone serve as the basis for national economic policy.
  • When a junior staff member in the Energy Department, in a meeting at the White House with Sununu and Reilly, mentioned an initiative to reduce fossil-fuel use, Sununu interrupted her. “Why in the world would you need to reduce fossil-fuel use?” he asked. “Because of climate change,” the young woman replied. “I don’t want anyone in this administration without a scientific background using ‘climate change’ or ‘global warming’ ever again,” he said. “If you don’t have a technical basis for policy, don’t run around making decisions on the basis of newspaper headlines.” After the meeting, Reilly caught up to the staff member in the hallway. She was shaken. Don’t take it personally, Reilly told her. Sununu might have been looking at you, but that was directed at me.
  • Reilly, for his part, didn’t entirely blame Sununu for Bush’s indecision on the prospect of a climate treaty. The president had never taken a vigorous interest in global warming and was mainly briefed about it by nonscientists. Bush had brought up the subject on the campaign trail, in his speech about the White House effect, after leafing through a briefing booklet for a new issue that might generate some positive press. When Reilly tried in person to persuade him to take action, Bush deferred to Sununu and Baker. Why don’t the three of you work it out, he said. Let me know when you decide
  • Relations between Sununu and Reilly became openly adversarial. Reilly, Sununu thought, was a creature of the environmental lobby. He was trying to impress his friends at the E.P.A. without having a basic grasp of the science himself.
  • Pomerance had the sinking feeling that the momentum of the previous year was beginning to flag. The censoring of Hansen’s testimony and the inexplicably strident opposition from John Sununu were ominous signs. So were the findings of a report Pomerance had commissioned, published in September by the World Resources Institute, tracking global greenhouse-gas emissions. The United States was the largest contributor by far, producing nearly a quarter of the world’s carbon emissions, and its contribution was growing faster than that of every other country. Bush’s indecision, or perhaps inattention, had already managed to delay the negotiation of a global climate treaty until 1990 at the earliest, perhaps even 1991. By then, Pomerance worried, it would be too late.
  • Pomerance tried to be more diplomatic. “The president made a commitment to the American people to deal with global warming,” he told The Washington Post, “and he hasn’t followed it up.” He didn’t want to sound defeated. “There are some good building blocks here,” Pomerance said, and he meant it. The Montreal Protocol on CFCs wasn’t perfect at first, either — it had huge loopholes and weak restrictions. Once in place, however, the restrictions could be tightened. Perhaps the same could happen with climate change. Perhaps. Pomerance was not one for pessimism. As William Reilly told reporters, dutifully defending the official position forced upon him, it was the first time that the United States had formally endorsed the concept of an emissions limit. Pomerance wanted to believe that this was progress.
  • All week in Noordwijk, Becker couldn’t stop talking about what he had seen in Zeeland. After a flood in 1953, when the sea swallowed much of the region, killing more than 2,000 people, the Dutch began to build the Delta Works, a vast concrete-and-steel fortress of movable barriers, dams and sluice gates — a masterpiece of human engineering. The whole system could be locked into place within 90 minutes, defending the land against storm surge. It reduced the country’s exposure to the sea by 700 kilometers, Becker explained. The United States coastline was about 153,000 kilometers long. How long, he asked, was the entire terrestrial coastline? Because the whole world was going to need this. In Zeeland, he said, he had seen the future.
  • Ken Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, Calif., has a habit of asking new graduate students to name the largest fundamental breakthrough in climate physics since 1979. It’s a trick question. There has been no breakthrough. As with any mature scientific discipline, there is only refinement. The computer models grow more precise; the regional analyses sharpen; estimates solidify into observational data. Where there have been inaccuracies, they have tended to be in the direction of understatement.
  • More carbon has been released into the atmosphere since the final day of the Noordwijk conference, Nov. 7, 1989, than in the entire history of civilization preceding it.
  • Despite every action taken since the Charney report — the billions of dollars invested in research, the nonbinding treaties, the investments in renewable energy — the only number that counts, the total quantity of global greenhouse gas emitted per year, has continued its inexorable rise.
  • When it comes to our own nation, which has failed to make any binding commitments whatsoever, the dominant narrative for the last quarter century has concerned the efforts of the fossil-fuel industries to suppress science, confuse public knowledge and bribe politicians.
  • The mustache-twirling depravity of these campaigns has left the impression that the oil-and-gas industry always operated thus; while the Exxon scientists and American Petroleum Institute clerics of the ’70s and ’80s were hardly good Samaritans, they did not start multimillion-dollar disinformation campaigns, pay scientists to distort the truth or try to brainwash children in elementary schools, as their successors would.
  • It was James Hansen’s testimony before Congress in 1988 that, for the first time since the “Changing Climate” report, made oil-and-gas executives begin to consider the issue’s potential to hurt their profits. Exxon, as ever, led the field. Six weeks after Hansen’s testimony, Exxon’s manager of science and strategy development, Duane LeVine, prepared an internal strategy paper urging the company to “emphasize the uncertainty in scientific conclusions.” This shortly became the default position of the entire sector. LeVine, it so happened, served as chairman of the global petroleum industry’s Working Group on Global Climate Change, created the same year, which adopted Exxon’s position as its own
  • The American Petroleum Institute, after holding a series of internal briefings on the subject in the fall and winter of 1988, including one for the chief executives of the dozen or so largest oil companies, took a similar, if slightly more diplomatic, line. It set aside money for carbon-dioxide policy — about $100,000, a fraction of the millions it was spending on the health effects of benzene, but enough to establish a lobbying organization called, in an admirable flourish of newspeak, the Global Climate Coalition.
  • The G.C.C. was conceived as a reactive body, to share news of any proposed regulations, but on a whim, it added a press campaign, to be coordinated mainly by the A.P.I. It gave briefings to politicians known to be friendly to the industry and approached scientists who professed skepticism about global warming. The A.P.I.’s payment for an original op-ed was $2,000.
  • It was joined by the U.S. Chamber of Commerce and 14 other trade associations, including those representing the coal, electric-grid and automobile industries.
  • In October 1989, scientists allied with the G.C.C. began to be quoted in national publications, giving an issue that lacked controversy a convenient fulcrum. “Many respected scientists say the available evidence doesn’t warrant the doomsday warnings,” was the caveat that began to appear in articles on climate change.
  • The following year, when President Bill Clinton proposed an energy tax in the hope of meeting the goals of the Rio treaty, the A.P.I. invested $1.8 million in a G.C.C. disinformation campaign. Senate Democrats from oil-and-coal states joined Republicans to defeat the tax proposal, which later contributed to the Republicans’ rout of Democrats in the midterm congressional elections in 1994 — the first time the Republican Party had won control of both houses in 40 years
  • The G.C.C. spent $13 million on a single ad campaign intended to weaken support for the 1997 Kyoto Protocol, which committed its parties to reducing greenhouse-gas emissions by 5 percent relative to 1990 levels. The Senate, which would have had to ratify the agreement, took a pre-emptive vote declaring its opposition; the resolution passed 95-0. There has never been another serious effort to negotiate a binding global climate treaty.
  • This has made the corporation an especially vulnerable target for the wave of compensatory litigation that began in earnest in the last three years and may last a generation. Tort lawsuits have become possible only in recent years, as scientists have begun more precisely to attribute regional effects to global emission levels. This is one subfield of climate science that has advanced significantly.
  • Pomerance had not been among the 400 delegates invited to Noordwijk. But together with three young activists — Daniel Becker of the Sierra Club, Alden Meyer of the Union of Concerned Scientists and Stewart Boyle from Friends of the Earth — he had formed his own impromptu delegation. Their constituency, they liked to say, was the climate itself. Their mission was to pressure the delegates to include in the final conference statement, which would be used as the basis for a global treaty, the target proposed in Toronto: a 20 percent reduction of greenhouse-gas combustion by 2005. It was the only measure that mattered, the amount of emissions reductions, and the Toronto number was the strongest global target yet proposed.
  • The delegations would review the progress made by the I.P.C.C. and decide whether to endorse a framework for a global treaty. There was a general sense among the delegates that they would, at minimum, agree to the target proposed by the host, the Dutch environmental minister, more modest than the Toronto number: a freezing of greenhouse-gas emissions at 1990 levels by 2000. Some believed that if the meeting was a success, it would encourage the I.P.C.C. to accelerate its negotiations and reach a decision about a treaty sooner. But at the very least, the world’s environmental ministers should sign a statement endorsing a hard, binding target of emissions reductions. The mood among the delegates was electric, nearly giddy — after more than a decade of fruitless international meetings, they could finally sign an agreement that meant something.
  • 11. ‘The Skunks at The Garden Party’ November 1989
  • It was nearly freezing — Nov. 6, 1989, on the coast of the North Sea in the Dutch resort town of Noordwijk
  • Losing Earth: The Decade We Almost Stopped Climate Change. We knew everything we needed to know, and nothing stood in our way. Nothing, that is, except ourselves. A tragedy in two acts. By Nathaniel Rich. Photographs and Videos by George Steinmetz. AUG. 1, 2018
Javier E

David Stockman: Mitt Romney and the Bain Drain - Newsweek and The Daily Beast - 1 views

  • Is Romney really a job creator? Ronald Reagan’s budget director, David Stockman, takes a scalpel to the claims.
  • Bain Capital is a product of the Great Deformation. It has garnered fabulous winnings through leveraged speculation in financial markets that have been perverted and deformed by decades of money printing and Wall Street coddling by the Fed. So Bain’s billions of profits were not rewards for capitalist creation; they were mainly windfalls collected from gambling in markets that were rigged to rise.
  • Mitt Romney claims that his essential qualification to be president is grounded in his 15 years as head of Bain Capital, from 1984 through early 1999. According to the campaign’s narrative, it was then that he became immersed in the toils of business enterprise, learning along the way the true secrets of how to grow the economy and create jobs. The fact that Bain’s returns reputedly averaged more than 50 percent annually during this period is purportedly proof of the case
  • ...19 more annotations...
  • Except Mitt Romney was not a businessman; he was a master financial speculator who bought, sold, flipped, and stripped businesses. He did not build enterprises the old-fashioned way—out of inspiration, perspiration, and a long slog in the free market fostering a new product, service, or process of production. Instead, he spent his 15 years raising debt in prodigious amounts on Wall Street so that Bain could purchase the pots and pans and castoffs of corporate America, leverage them to the hilt, gussy them up as reborn “roll-ups,” and then deliver them back to Wall Street for resale—the faster the better.
  • That is the modus operandi of the leveraged-buyout business, and in an honest free-market economy, there wouldn’t be much scope for it because it creates little of economic value. But we have a rigged system—a regime of crony capitalism—where the tax code heavily favors debt and capital gains, and the central bank purposefully enables rampant speculation by propping up the price of financial assets and battering down the cost of leveraged finance.
  • So the vast outpouring of LBOs in recent decades has been the consequence of bad policy, not the product of capitalist enterprise. I know this from 17 years of experience doing leveraged buyouts at one of the pioneering private-equity houses, Blackstone, and then my own firm. I know the pitfalls of private equity. The whole business was about maximizing debt, extracting cash, cutting head counts, skimping on capital spending, outsourcing production, and dressing up the deal for the earliest, highest-profit exit possible. Occasionally, we did invest in genuine growth companies, but without cheap debt and deep tax subsidies, most deals would not make economic sense.
  • In truth, LBOs are capitalism’s natural undertakers—vulture investors who feed on failing businesses. Due to bad policy, however, they have now become monsters of the financial midway that strip-mine cash from healthy businesses and recycle it mostly to the top 1 percent.
  • Accordingly, Bain’s returns on the overwhelming bulk of the deals—67 out of 77—were actually lower than what a passive S&P 500 indexer would have earned even without the risk of leverage or paying all the private-equity fees. Investor profits amounted to a prosaic 0.7X the original investment on these deals and, based on its average five-year holding period, the annual return would have computed to about 12 percent—well below the 17 percent average return on the S&P in this period.
  • having a trader’s facility for knowing when to hold ’em and when to fold ’em has virtually nothing to do with rectifying the massive fiscal hemorrhage and debt-burdened private economy that are the real issues before the American electorate
  • Indeed, the next president’s overriding task is restoring national solvency—an undertaking that will involve immense societywide pain, sacrifice, and denial and that will therefore require “fairness” as a defining principle. And that’s why heralding Romney’s record at Bain is so completely perverse. The record is actually all about the utter unfairness of windfall riches obtained under our anti-free market regime of bubble finance.
  • When Romney opened the doors to Bain Capital in 1984, the S&P 500 stood at 160. By the time he answered the call to duty in Salt Lake City in early 1999, it had gone parabolic and reached 1270. This meant that had a modern Rip Van Winkle bought the S&P 500 index and held it through the 15 years in question, the annual return (with dividends) would have been a spectacular 17 percent. Bain did considerably better, of course, but the reason wasn’t business acumen.
  • The credentials that Romney proffers as evidence of his business acumen, in fact, mainly show that he hung around the basket during the greatest bull market in recorded history.
  • The Wall Street Journal examined 77 significant deals completed during that period based on fundraising documents from Bain, and the results are a perfect illustration of bull-market asymmetry. Overall, Bain generated an impressive $2.5 billion in investor gains on $1.1 billion in investments. But 10 of Bain’s deals accounted for 75 percent of the investor profits.
  • The secret was leverage, luck, inside baseball, and the peculiar asymmetrical dynamics of the leveraged gambling carried on by private-equity shops. LBO funds are invested as equity at the bottom of a company’s capital structure, which means that the lenders who provide 80 to 90 percent of the capital have no recourse to the private-equity sponsor if deals go bust. Accordingly, LBO funds can lose 1X (one times) their money on failed deals, but make 10X or even 50X on the occasional “home run.” During a period of rising markets, expanding valuation multiples, and abundant credit, the opportunity to “average up” the home runs with the 1X losses is considerable; it can generate a spectacular portfolio outcome.
  • By contrast, the 10 home runs generated profits of $1.8 billion on investments of only $250 million, yielding a spectacular return of 7X investment. Yet it is this handful of home runs that both make the Romney investment legend and also seal the indictment: they show that Bain Capital was a vehicle for leveraged speculation that was gifted immeasurably by the Greenspan bubble. It was a fortunate place where leverage got lucky, not a higher form of capitalist endeavor or training school for presidential aspirants.
  • The startling fact is that four of the 10 Bain Capital home runs ended up in bankruptcy, and for an obvious reason: Bain got its money out at the top of the Greenspan boom in the late 1990s and then these companies hit the wall during the 2000-02 downturn, weighed down by the massive load of debt Bain had bequeathed them. In fact, nearly $600 million, or one third of the profits earned by the home-run companies, had been extracted from the hide of these four eventual debt zombies.
  • The bankruptcy forced the closure of about 250—or 40 percent—of the company’s stores and the loss of about 5,000 jobs. Yet the moral of the Stage Stores saga is not simply that in this instance Bain Capital was a jobs destroyer, not a jobs creator. The larger point is that it is actually a tale of Wall Street speculators toying with Main Street properties in defiance of sound finance—an anti-Schumpeterian project that used state-subsidized debt to milk cash from stores that would not have otherwise survived on the free market.
  • Ironically, the businesses and jobs that Staples eliminated were the office-supply counterparts of the cracker-box stores selling shoes, shirts, and dresses that Bain kept on artificial life-support at Stage Stores Inc. At length, Wal-Mart eliminated these jobs and replaced them with back-of-the-store automation and front-end part-timers, as did Staples, which now has 40,000 part-time employees out of its approximately 90,000 total head count. The pointless exercise of counting jobs won and lost owing to these epochal shifts on the free market is obviously irrelevant to the job of being president, but the fact that Bain made $15 million from the winner and $175 million from the loser is evidence that it did not make a fortune all on its own. It had considerable help from the Easy Button at the Fed.
  • The lesson is that LBOs are just another legal (and risky) way for speculators to make money, but they are dangerous because when they fail, they leave needless economic disruption and job losses in their wake. That’s why LBOs would be rare in an honest free market—it’s only cheap debt, interest deductions, and ludicrously low capital-gains taxes that artifically fuel them.
  • The larger point is that Romney’s personal experience in the nation’s financial casinos is no mark against his character or competence. I’ve made money and lost it and know what it is like to be judged. But that experience doesn’t translate into answers on the great public issues before the nation, either. The Romney campaign’s feckless narrative that private equity generates real economic efficiency and societal wealth is dead wrong.
  • The Bain Capital investments here reviewed accounted for $1.4 billion or 60 percent of the fund’s profits over 15 years, by my calculations. Four of them ended in bankruptcy; one was an inside job and fast flip; one was essentially a massive M&A brokerage fee; and the seventh and largest gain—the Italian Job—amounted to a veritable freak of financial nature.
  • In short, this is a record about a dangerous form of leveraged gambling that has been enabled by the failed central banking and taxing policies of the state. That it should be offered as evidence that Mitt Romney is a deeply experienced capitalist entrepreneur and job creator is surely a testament to the financial deformations of our times.
rachelramirez

Chattanooga All-Girls Charter School's Path to Success - The Atlantic - 0 views

  • The Key Ingredient to Fixing a Failing School
  • More than 90 percent of its students are black or Latino. Nearly all are low income. The school’s brochure says it was founded “to improve educational opportunities for low-income, underserved girls in Hamilton County.”
  • And critics can point to research published in Science magazine that suggests single-sex schools don’t foster better academic outcomes and accuse charters of pulling resources away from neighborhood schools.
  • ...14 more annotations...
  • Elaine Swafford was hired as CGLA’s executive director in 2012 and given less than a year to transform the then-failing school, which had launched several years earlier as the state’s first single-gender public charter school and then tanked.
  • Math proficiency was in the single digits. Few students had solid prospects for the future.
  • Last year, the school achieved a 93 percent graduation rate and nearly every graduate went on to college.
  • Over the last decade, new research has increasingly suggested that strong school leaders are a crucial component of success, and even that turning around failing schools is virtually impossible without a strong leader at the helm.
  • On an adjoining wall, hundreds of magnets bearing individual names track each girl’s proficiency in a range of subjects, from below basic to advanced.
  • When she took over, half of the school’s teachers turned over after she informed the staff they would need to reapply for their jobs.
  • Swafford’s way of thinking is based on the idea that if she can’t deliver the well-rounded education both in and out of the classroom that middle-class kids get, her students will never catch up to their more affluent peers
  • Each student at CGLA is assigned an adult mentor and Swafford recently hired a college-access counselor to help students work through the federal financial-aid application and to stay connected with recent graduates
  • After she recently discovered that 85 of the school’s 350 students don’t have internet access at home, Swafford set about securing a hotspot for each wifi-less child.
  • The school receives some extra funding from the government because it serves such a high number of low-income students, but it’s not nearly enough to pay for all of the programs the school offers, so Swafford has gotten very good at raising money
  • Twenty-nine percent of the school’s budget is from fundraising, she told me, which helps fill a $4,500-per-student gap in funding.
  • Teachers are expected to believe that every child is capable of success and then help them achieve it by doing whatever it takes, regardless of any obstacles
  • The school recently implemented “grit” grading, and, Swafford acknowledged unapologetically, “everybody’s not happy about it.”
Javier E

What a Failed Trump Administration Looks Like - The New York Times - 0 views

  • President Trump’s mental state is like a train that long ago left freewheeling and iconoclastic, has raced through indulgent, chaotic and unnerving, and is now careening past unhinged, unmoored and unglued.
  • Trump’s White House staff is at war with itself. His poll ratings are falling at unprecedented speed. His policy agenda is stalled. F.B.I. investigations are just beginning. This does not feel like a sustainable operation.
  • On the other hand, I have trouble seeing exactly how this administration ends. Many of the institutions that would normally ease out or remove a failing president no longer exist.
  • ...13 more annotations...
  • There are no longer moral arbiters in Congress like Howard Baker and Sam Ervin to lead a resignation or impeachment process. There is no longer a single media establishment that shapes how the country sees the president. This is no longer a country in which everybody experiences the same reality.
  • Everything about Trump that appalls 65 percent of America strengthens him with the other 35 percent, and he can ride that group for a while. Even after these horrible four weeks, Republicans on Capitol Hill are not close to abandoning their man.
  • The likelihood is this: We’re going to have an administration that has morally and politically collapsed, without actually going away.
  • To get anything done, a president depends on the vast machinery of the U.S. government. But Trump doesn’t mesh with that machinery. He is personality-based while it is rule-based. Furthermore, he’s declared war on it. And when you declare war on the establishment, it declares war on you.
  • Second, this will probably become a more insular administration. Usually when administrations stumble, they fire a few people and bring in the grown-ups
  • But Trump is anti-grown-up
  • the circle of trust seems to be shrinking to his daughter, her husband and Stephen Bannon.
  • In an administration in which “promoted beyond his capacity” takes on new meaning, Bannon looms. With each passing day, Trump talks more like Bannon without the background reading.
  • Third, we are about to enter a decentralized world. For the past 70 years most nations have instinctively looked to the U.S. for leadership, either to follow or oppose. But in capitals around the world, intelligence agencies are drafting memos with advice on how to play Donald Trump.
  • The first conclusion is obvious. This administration is more like a medieval monarchy than a modern nation-state. It’s more “The Madness of King George” than “The Missiles of October.” The key currency is not power, it’s flattery.
  • If you want to roll the Trump administration, you’ve got to get in line. The Israelis got a possible one-state solution. The Chinese got Trump to flip-flop on the “One China” policy. The Europeans got him to do a 180 on undoing the Iran nuclear deal.
  • We’re about to enter a moment in which U.S. economic and military might is strong but U.S. political might is weak. Imagine the Roman Empire governed by Monaco.
  • The only saving thought is this: The human imagination is vast, but it is not nearly vast enough to encompass the infinitely multitudinous ways Donald Trump can find to get himself disgraced.
Ellie McGinnis

How John Kerry Could End Up Outdoing Hillary Clinton - David Rohde - The Atlantic - 0 views

  • Kerry’s first foreign-policy speech as secretary, an hour-long defense of diplomacy and foreign aid, was a flop.
  • The nearly universal expectation was that Kerry’s tenure would be overshadowed by his predecessor’s, for a long list of reasons.
  • arriving in Foggy Bottom when the country seemed to be withdrawing from the world. Exhausted by two long wars, Americans were wary of ambitious new foreign engagements—certainly of military ones, but of entangling diplomatic ones, too
  • ...89 more annotations...
  • Barack Obama’s administration, accelerating a process that had begun in the early 1960s under President Kennedy, was centralizing foreign-policy decision making in the White House’s National Security Council, marginalizing the State Department.
  • Finally, Kerry, a defeated presidential candidate, was devoid of the sexiness that automatically attaches to a figure, like Hillary Clinton, who remains a legitimate presidential prospect
  • The consensus in Washington was that Kerry was a boring if not irrelevant man stepping into what was becoming a boring, irrelevant job.
  • Nearly a year into his tenure, Kerry is the driving force behind a flurry of Mideast diplomacy the scope of which has not been seen in years. In the face of widespread skepticism, he has revived the Israeli-Palestinian peace process; brokered a deal with Russia to remove chemical weapons from Syria; embarked on a new round of nuclear talks with Iran, holding the highest-level face-to-face talks with Iranian diplomats in years; and started hammering out a new post-withdrawal security agreement with Afghan President Hamid Karzai.
  • it will be Kerry who is credited with making the State Department relevant again.
  • “He’s front and center on all these issues. That clearly represents a very ambitious first year for any secretary of state.”
  • Kerry has a bad habit of wandering off script. On a trip to Pakistan in August, he created two diplomatic incidents in a single interview. First he said that the Egyptian army was “restoring democracy” when it toppled the country’s democratically elected president.
  • President Obama had “a timeline” for ending U.S. drone strikes in Pakistan.
  • he overshot in the opposite direction, promising that any American strike against Syria would be “unbelievably small”—a bit of impressively self-defeating rhetoric that undermined days of administration efforts to argue that a few dozen Tomahawk cruise-missile strikes would be more than what hawkish critics were calling a pointless “pinprick.”
  • a word that comes up frequently in conversations about Kerry is gasbag. He had few close friends in the Senate, where he served for nearly 30 years. A former diplomat says Kerry’s recent foreign-policy successes have made him more insufferable than ever.
  • his gaffes are caused by arrogance and indiscipline. They say that even in a city swollen with egotism and pomposity, Kerry stands out.
  • “Nobody would challenge the notion that he’s been very much a team player and willing to take on really hard assignments from the president and go to the toughest places.”
  • (In one late-night press conference in Moscow last May, he uttered a staggering 95-word sentence.)
  • “Even as a junior or senior, he was a pompous blowhard,” says someone who attended Yale with Kerry in the 1960s and asked not to be named.
  • he is not so much arrogant as awkward.
  • Liberal Democrats call his hawkish views on Syria a betrayal of his antiwar past. Republicans say he is a perennial flip-flopper: he fought in the Vietnam War and then protested against it; he supported the 2003 invasion of Iraq and then opposed it; he tried to negotiate with Bashar al‑Assad in 2009, then compared him to Adolf Hitler—and then reopened the door to negotiating with him again.
  • Kerry “just can’t dance.”
  • Washington mandarins dismiss Kerry’s foreign-policy ambitions as grandiose and overweening, especially relative to what America’s diminishing power can achieve after Iraq and Afghanistan
  • old foreign-policy hands say that instead of acknowledging the limits of American power in the post–Arab Spring Middle East, Kerry looks for misguided ways to apply power the country no longer has.
  • Current aides argue that Kerry’s recent successes belie the caricatures of him. “Show me where he hasn’t done this job well,” one demanded when I interviewed him in mid-October.
  • “I would ask John Kerry, ‘How can you ask a man to be the first one to die for a mistake?’ ”
  • Kerry seem “pompous” is that “oftentimes he tries too hard.” According to Manley and others, Kerry had a knack for walking up to fellow members on the Senate floor at precisely the wrong time.
  • His enormous ambition motivates him to aim for major breakthroughs despite daunting odds. And his healthy self-confidence allows him to believe that he can convince anyone of virtually anything.
  • Kerry also has bottomless reserves of patience that allow him to engage for hours in seemingly fruitless negotiations; he persists long past the time others would have given up in exhaustion.
  • The amount of time he’s spent negotiating with Afghanistan’s Hamid Karzai and Russia’s Sergey Lavrov alone should qualify him for some kind of diplomatic medal of honor.
  • an indifference to his own political standing.
  • Political calculations may have constrained the risks Hillary Clinton was willing to take. Kerry, in contrast, no longer needs to heed political consultants. Nor does he need to worry too much about what his detractors say.
  • “I don’t care at all,” he said. “I could care less about it. You know, David, I ran for president, so I’m not scared of failure.”
  • secretary of state is the job for which Kerry was born and bred
  • “I’m not worried about the politics,” Lowenstein recalls Kerry telling him. “I want to get things done.”
  • Obama, too, no longer has to worry about reelection; concerns about the 2012 election may have limited the president’s own appetite for diplomatic risk taking in the Mideast during his first term.
  • But his enthusiasm for his current job is unquestionable; one aide told me that he will have to be dragged from the office—fingernails scraping against the floor—at the end of his term.
  • As a presidential candidate, he had to downplay his obsession with foreign policy and his fluency in foreign languages, for fear that such things would play badly with voters; as secretary of state, he can freely leverage those qualities.
  • if there is no breakthrough with Iran, or if his efforts to broker peace in Syria fall short, or if the Israeli-Palestinian peace talks founder, history will likely view Kerry as the tragicomic figure his detractors already judge him to be.
  • “After you lose the presidency, you don’t have much else to lose.”
  • Following stints as an assistant district attorney and the lieutenant governor of Massachusetts, Kerry would, after his election to the Senate in 1984, go on to serve for 28 years on the same committee he had stood before in 1971, the Senate Foreign Relations Committee
  • (But for Ohio, where he lost to Bush by 119,000 votes, Kerry would have been president.)
  • But Kerry stepped into the role at a singularly weak moment for the position. For one thing, America, weary after a decade of conflict, is turning inward; activist diplomacy is out of favor. For another, State Department employees I interviewed told me that morale is low.
  • the department is too hierarchical, inflexible, and risk-averse—and is in danger of becoming even more so in the aftermath of Benghazi.
  • the intensely controlling Obama administration has centralized foreign-policy decision making in the National Security Council, weakening the State Department.
  • Just a day after Kerry delivered one of the most impassioned speeches of his career, assailing Assad’s use of chemical weapons on civilians as a “crime against conscience” and sending a clear signal that U.S. air strikes on Syria were imminent, the president announced that missile strikes might in fact not be imminent, and that he would be seeking congressional authorization to attack Syria.
  • the president risked causing foreign leaders and negotiators to doubt whether any future warnings or statements issued by Kerry were backed by the White House.
  • Kerry conducted long interviews with every living former secretary of state—Kissinger, George Shultz, Baker, Madeleine Albright, Powell, Condoleezza Rice, and Clinton—and set out to model himself after Shultz, who, in six and a half years serving under Ronald Reagan, was seen as a combination of the two prototypes, both a great diplomat and a good manager.
  • “I don’t care about risk, honestly,” he said, leaning forward in his chair, spoiling for a fight. “The riskiest thing to do is to not act. I would far rather try and fail than fail not trying.”
  • When off the record, in relaxed settings, he is refreshingly direct, profane, and insightful, speaking bluntly about the limits of American power and caustically lamenting Washington’s growing paralysis and partisanship
  • He finishes sentences with phrases such as something like that or that’s about it or thanks, man. Toes tapping, head bobbing back and forth, he speaks with fervor and candor. His tenacity is palpable.
  • Recent secretaries of state have had different strengths. Henry Kissinger and James Baker, two secretaries who had close relationships with their presidents (Nixon in Kissinger’s case, George H. W. Bush in Baker’s), were powerful bureaucratic players.
  • But isn’t staking America’s credibility, and his own reputation, on long-odds breakthrough agreements with Tehran or Moscow, or on Israeli-Palestinian peace talks, a dubious exercise, as Obama’s failed first-term efforts at Mideast peace demonstrated?
  • Colin Powell lost a crucial internal administration battle in failing to halt the Bush White House’s march to war in Iraq—but was adored at the State Department for implementing sweeping administrative reforms.
  • Clinton embraced a new, Google Hangout era of town-hall diplomacy, and she elevated economic development and women’s issues. She was an architect of the administration’s “pivot to Asia,” and she took risks in supporting the Afghanistan troop surge and the intervention in Libya.
  • steered clear of the Middle East, delegating special envoys like Richard Holbrooke and George Mitchell to grapple with the Israeli-Palestinian peace process, peace talks with the Taliban
  • Clinton was much more prudent and careful than Kerry, whom one former State Department official describes as more of a “high-risk, high-reward”
  • “My view is that she was pretty sheltered,” he told me. “They were not interpersonally pleasant, and they were very protective of her. You can get into a cocoon.”
  • “My assessment was that she made a calculated political choice not to hang her hat on that thankless task,” Kim Ghattas,
  • the former secretary would have taken bolder risks but was reined in by the White House—especially during her first couple of years in office, when hostility from the bitter 2008 primary campaign still lingered between the Obama and Clinton staffs.
  • she actively engaged in Middle East talks, at one point meeting with Israeli Prime Minister Benjamin Netanyahu for seven hours in New York.
  • Kennan warned Powell about the dangers of traveling too much—of prioritizing activist diplomacy over providing the White House with solid foreign-policy analysis.
  • Powell gave a copy of Kennan’s letter to Kerry. So far, Kerry is not following the advice. As October came to a close, Kerry had already flown more than 213,000 miles and spent more than 100 days—roughly 40 percent of his time—outside the United States. In his first nine months, he’d traveled more miles than Clinton had in her entire first year in office.
  • In 2009, he convinced Afghan President Hamid Karzai to consent to a runoff in his country’s disputed presidential election.
  • 2011, he was dispatched to Pakistan after the killing of Osama bin Laden to persuade local officials to return the tail of an American helicopter that had crashed at the site.
  • cemented Kerry’s bond with Obama was less his diplomatic achievements than his ability to impersonate another tall, wealthy Massachusetts politician with good hair: Kerry served as Mitt Romney’s surrogate during weeks of preparation for the 2012 presidential debates.
  • Kerry channeled Romney so effectively that, aides to both men say, he got under Obama’s skin.
  • Kerry agreed that the U.S. should try to revive Middle East negotiations before the Palestinians again pushed for statehood, at the United Nations General Assembly in September 2013.
  • In private meetings with Israeli Prime Minister Netanyahu and Palestinian President Mahmoud Abbas, Obama pushed for a resumption of negotiations. At a final press conference before returning to Washington, Obama announced that he was handing the pursuit of talks over to Kerry.
  • “What I can guarantee is that Secretary Kerry is going to be spending a good deal of time in discussions with the parties.”
  • He met alone with Abbas for two hours in Amman and then flew to Jerusalem to meet with Netanyahu and three of his aides.
  • Kerry pressed on, returning in April to Jerusalem and Ramallah, the de facto Palestinian capital in the West Bank. After 24 hours of talks with both sides, Kerry held a press conference at the airport in Tel Aviv.
  • Kerry held three meetings with Netanyahu and Abbas in three days, including one meeting with the Israeli prime minister that lasted six hours, until 3 a.m. On June 29, he canceled a trip to the United Arab Emirates so he could keep talking with Netanyahu and Abbas, raising expectations of a breakthrough. On June 30, he held another press conference at the Tel Aviv airport.
  • “We started out with very wide gaps, and we have narrowed those considerably.”
  • Five months into the job, Kerry was off to an ominous start. His wife was in the hospital. Syria was convulsing. Progress toward Israeli-Palestinian talks was stalled. Egypt was burning. And Republican attack ads were making it appear as though the secretary of state had spent the weekend yachting.
  • “The only thing I’m interested in is a serious negotiation that can lead to a final-status agreement,” Kerry said, according to the aide.
  • “On behalf of President Obama, I am pleased to announce that we have reached an agreement that establishes a basis for resuming direct final-status negotiations between the Palestinians and the Israelis,” Kerry said, calmly and deliberately. “This is a significant and welcome step forward.” He declined to take questions.
  • Nine days later, the Israeli cabinet approved the release of the 104 Palestinian prisoners. The next day, Israeli and Palestinian officials arrived in Washington to begin peace talks.
  • The smallness of his circle of aides, which had been seen early on as a detriment to his management of the State Department, now made it easier to keep information contained.
  • Working with consultants from McKinsey, diplomats estimated that $4 billion in long-term private investment would flow to the Palestinians in the wake of an agreement.
  • Palestinian officials appear to have compromised on their demand for a settlement freeze.
  • From the beginning, Kerry had insisted that the Obama administration not allow a halt in Israeli settlement construction to become a public precondition.
  • Kerry also reiterated a core argument: the security that Israel currently enjoys is temporary, if not illusory. Without a two-state solution, Israel will face a European-led campaign of delegitimization, a new intifada, and a Palestinian leader far more radical than Abbas.
  • The crucial concession—the release of the 104 prisoners—came from the Israeli side
  • “It takes time to listen, it takes time to persuade,” Frank Lowenstein told me. “This is where Kerry’s willingness to stay up all night pays off.”
  • The U.S. provided nonlethal aid to the opposition, but White House officials were so fearful of American assistance inadvertently falling into the hands of jihadists that the National Security Council Deputies Committee monitored the distribution of the aid in granular detail. Qatar and Saudi Arabia, meanwhile, were funneling cash and weapons to hard-line militants, including Al Nusra Front, an al-Qaeda affiliate.
  • Russia continued providing Syria with arms and blocking any action by the UN Security Council.
  • When Putin finally received Kerry, after a three-hour delay, he reportedly fiddled continuously with his pen and “more resembled a man indulging a long-ago scheduled visit from the cultural attaché of Papua New Guinea than participating in an urgent summit with America’s top diplomat,”
  • At a late-night press conference, a beaming Kerry announced that he and Lavrov would co-host a peace conference in Geneva.
  • “They were great efforts, and again, I reiterate my gratitude to President Putin for a very generous welcome here.”
  • Earlier, in April, after American intelligence officials had confirmed that Assad had carried out several small-scale chemical-weapons attacks, Obama had reluctantly agreed to mount a covert CIA effort to arm and train moderate rebels.
  • if the United States did not “impose consequences” for Assad’s use of chemical weapons, the Syrian leader would see it as “a green light for continued CW use.” But the White House did not alter course.
  • Both Obama and Kerry favored a military response—air strikes—according to a senior administration official. As American intelligence agencies accumulated evidence suggesting that Assad was responsible, Kerry offered to make the public case for strikes. White House officials welcomed the idea and vetted his speeches.
  • “My vision is that, if you can make peace, if you can get Israel and Palestine resolved and can get the Iranian threat of a nuclear weapon put to bed appropriately—even if Syria didn’t calm down—if you get those two pieces or one piece of that, you’ve got a hugely changed dynamic in a region that is in turmoil. And if you take just the Palestinian-Israeli situation, you have the potential to make peace with 57 nations—35 Muslim nations and 22 Arab nations. If the issue is resolved, they will recognize Israel.”
Javier E

Failure Is an Option: Does History Forecast Disaster for the United States? - The Atlantic - 1 views

  • it is clear that human societies do not progress inevitably toward greater wealth. Creating the conditions in which self-interest will foster economic development is harder than optimistic Enlightenment thinkers believed. Economic growth is not predestined: Many countries have seen long-term declines in standards of living, as did Argentina in the twentieth century. Others, such as large parts of Africa, seem mired in strife and poverty. With even the United States and Western Europe facing economic stagnation, burdensome debt levels, unfavorable demographics, and rising global competition, it seems that sustained stability and prosperity may be the historical exception rather than the rule.
  • Why some societies stagnate while others thrive is the question addressed by economist Daron Acemoglu and political scientist James Robinson in Why Nations Fail: The Origins of Power, Prosperity, and Poverty.
  • differences, Acemoglu and Robinson argue, can all be explained by institutions. Long-lasting institutions, not short-term government policies, are the key determinant of societal outcomes. Development is not as simple as adopting a smarter set of economic policies: Instead, "the main obstacle to the adoption of policies that would reduce market failures and encourage economic growth is not the ignorance of politicians but the incentives and constraints they face from the political and economic institutions in their societies."
  • ...30 more annotations...
  • Acemoglu and Robinson outline a theory of how economic and political institutions shape the fate of human societies. They reinterpret the rise and fall of civilizations throughout history, showing how differences in institutions interact with changing circumstances to produce development or stagnation.
  • It also has implications for the contemporary United States, where increasing inequality and the growing influence of money in politics threaten to reshape our political institutions.
  • In more fortunate countries, pluralistic political institutions prevent any one group from monopolizing resources for itself, while free markets empower a large class of people with an interest in defending the current system against absolutism. This virtuous circle, which first took form in seventeenth-century England, is the secret to economic growth.
  • Economic institutions are themselves the products of political processes, which depend on political institutions. These can also be extractive, if they enable an elite to maintain its dominance over society, or inclusive, if many groups have access to the political process. Poverty is not an accident: "[P]oor countries are poor because those who have power make choices that create poverty." Therefore, Acemoglu and Robinson argue, it is ultimately politics that matters.
  • The logic of extractive and inclusive institutions explains why growth is not foreordained. Where a cohesive elite can use its political dominance to get rich at the expense of ordinary people, it has no need for markets and free enterprise, which can create political competitors. In addition, because control of the state can be highly lucrative, infighting among contenders for power produces instability and violence. This vicious circle keeps societies poor
  • Countries differ in their economic success because of their different institutions, the rules influencing how the economy works, and the incentives that motivate people," write Acemoglu and Robinson. Extractive institutions, whether feudalism in medieval Europe or the use of schoolchildren to harvest cotton in contemporary Uzbekistan, transfer wealth from the masses to elites. In contrast, inclusive institutions -- based on property rights, the rule of law, equal provision of public services, and free economic choices -- create incentives for citizens to gain skills, make capital investments, and pursue technological innovation, all of which increase productivity and generate wealth.
  • Acemoglu and Robinson differentiate their account from alternatives that they label the "culture," "geography," and "ignorance" hypotheses.
  • An example of the first is Max Weber's famous argument that Calvinism lay at the roots of capitalist development
  • the best-known recent example of the second is Jared Diamond's explanation of the Spanish Conquest as the inevitable outcome of geographic differences between Eurasia and the Americas.
  • Most economists, Acemoglu and Robinson assert, subscribe to the ignorance hypothesis, according to which "poor countries are poor because they have a lot of market failures and because economists and policymakers do not know how to get rid of them." According to this view, development can be engineered through technocratic policies administered by enlightened experts.
  • this focus on policy obscures the fundamental importance of politics.
  • Their perspective is informed by New Institutional Economics, an approach developed in the last quarter of the twentieth century, and associated with prominent economists such as Douglass North and Oliver Williamson, that focuses on how economic forces are mediated by institutions such as political systems and legal codes
  • A state based on extractive institutions, whether the Kuba Kingdom of seventeenth-century Central Africa or more recently the Soviet Union, can generate growth, especially when starting from low levels of development. But in most of these cases, the ruling elite is unwilling to allow inclusive economic institutions because they would threaten its political supremacy; the inevitable result is economic stagnation.
  • This leaves open the question of why some societies end up with inclusive rather than extractive institutions -- why some are rich and some are poor. The answer, according to Acemoglu and Robinson, is that institutions evolve -- and that history is messy.
  • Institutions change in subtle ways over time, allowing societies to drift apart. When major shocks occur, small differences in institutions can send societies down vastly different historical paths.
  • Early modern England, France, and Spain were all feudal societies with power-hungry monarchs. But the English Parliament had slightly more power than its continental relatives; as a result, the crown was unable to monopolize trade with the Americas, which made many merchants rich instead; in turn, this new commercial class became an important part of the coalition that overthrew James II in 1688, successfully fighting off absolutism. In Spain, by contrast, the monarchy controlled overseas trade, quashed internal challenges to its authority, and maintained extractive economic institutions -- and the country went into long-term decline. Crucially, Acemoglu and Robinson remind us that these outcomes were not preordained. James II might have suppressed the Glorious Revolution, or the Spanish Armada might have succeeded a century earlier. History is like that.
  • In this light, the material prosperity of the modern world, unevenly distributed though it is, is a fortunate historical accident.
  • But inclusive institutions can also break down. In the late thirteenth and early fourteenth centuries, a small group of families transformed Venice's semi-democratic institutions into a hereditary aristocracy and then monopolized long-distance trade, spelling the end of the city-state's economic expansion
  • Acemoglu and Robinson, by contrast, examine why nations fail. Societies, in their telling, are like Tolstoy's families: The success stories are similar -- pluralist democracies with regulated capitalist economies -- but failure comes in different forms. There are many ways in which elites can impose extractive institutions that cripple economic development.
  • The United States is one of the happy families of Why Nations Fail. Although our institutions have often been deeply flawed, Acemoglu and Robinson show how crucial moments in history, from Jamestown to the Progressive Era to the civil-rights movement, have led to the expansion of political democracy and economic opportunity.
  • Rather than as a series of inevitable triumphs, however, this history can also be seen as a warning -- that our institutions are fragile, always at risk of being subverted by elites seeking to exploit political power for their narrow economic ends. That risk has reappeared today.
  • The power of the financial sector is only one example of the broader threat to our inclusive political institutions: namely, the ability of the economic elite to translate their enormous fortunes directly into political power. In the wake of the Supreme Court's 2010 decision in Citizens United, super PACs can mobilize unlimited amounts of money--and can accept contributions from 501(c)4 organizations, which do not have to identify their donors.
  • This may seem like a level playing field. But money is not distributed evenly. American Crossroads, for example, has consistently raised more than 90 percent of its funds from billionaires (with a "b"). The recent, breathtaking rise in inequality has put unprecedented resources at the disposal of the super-rich. With the ability to secretly invest unlimited sums in political activities, they now have the opportunity to swamp all other participants in American politics.
  • Rising inequality and deregulation of political spending have made possible a new kind of class warfare. The 1 percent can blanket the airwaves, install their chosen representatives, and sway public policy in their favor.
  • The most direct way to translate political power into cold, hard cash is to advocate for lower taxes. Republican presidential candidates spent the past year competing to offer the most bountiful tax cuts to the super-rich
  • Showering goodies on the rich would require draconian cuts to Social Security and Medicare -- programs that are popular among the Tea Party rank and file. Republicans' current anti-tax orthodoxy reflects the interests of their wealthy funders rather than their middle-income base.
  • As Warren Buffett observed, "there's been class warfare going on for the last twenty years, and my class has won." This should be little surprise: "My side has had the nuclear bomb. We've got K Street, we've got lobbyists, we've got money on our side."
  • Supreme Court justices appointed by Republican presidents were instrumental in unleashing unlimited corporate political spending in Citizens United, accelerating the concentration of political power in the hands of the super-rich.
  • The most potent bulwark of inclusive institutions is probably the rich variety of influential interest groups that all have the ability to participate in politics. Still, the accumulation of huge fortunes and their deployment for political ends has changed the nature of our political institutions. Funding by the economic elite is a major reason why Republicans advocate transfers from ordinary people to the rich in the form of tax cuts and reductions in government services -- and why Democrats have been dragged to the right along with the GOP
  • Acemoglu recently said, "We need noisy grassroots movements to deliver a shock to the political system," citing both the Tea Party and Occupy Wall Street as potentially helpful developments. As he recognized, however, the one with more staying power -- the Tea Party -- has been co-opted by well-funded, elite-dominated groups (including Americans for Prosperity). If a popular movement can be bankrolled as easily as an attack ad, it is hard to see what money can't buy in politics. The next test for America will be whether our political system can fend off the power of money and remain something resembling a real democracy -- or whether it will become a playground where a privileged elite works out its internal squabbles.
Javier E

Opinion | A Cheer for Italy's Awful New Government - The New York Times - 0 views

  • They are on to something, and that is why they won, just as Trump won because he intuited a seeping anger that too many liberals had ignored.
  • They are right that almost three decades of globalization since the end of the Cold War has left too many people behind in too many Western democracies, starved them of hope or even a say, and given them the impression that the system was rigged by elites in Brussels or other metropolitan hubs.
  • The 2008 financial meltdown and the subsequent euro crisis came and went with near total impunity for those responsible. Until Western democracies confront their failings, the tide of popular rage won’t abate.
  • ...4 more annotations...
  • The European Union has failed Italy because promised solidarity in taking in immigrants reaching Europe through Mediterranean routes hardly materialized. In 2017, Italy received over 60 percent of such migrants
  • It has failed Italy because the rigid fiscal constraints of membership of the euro — set up to ensure that Italy’s budgetary laxness and administrative inefficiency would not be a problem for Germans — have proved unsustainable, engendering growing resentment toward Chancellor Angela Merkel.
  • let Salvini and Di Maio and Giuseppe Conte, the new prime minister whose inflation of his academic credentials is not reassuring, go to work on the mess. It’s much better to have them fail on the inside than have them rail from the outside. It’s better to have them lose support through failure than gain support through bluster.
  • a core beauty of the European Union is that its interlocking institutions are designed precisely to ensure that no country can go off on what the Germans call a Sonderweg — the sort of wayward path of nationalism and mysticism and racism that led Germany, and all of Europe, to ruin.