
Javier E

These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since.
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies.
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it ensures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanity is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A nation has borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • History teems with mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description: Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____; as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • This instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where it appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:

        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • Like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • Fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • Nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department.
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
President Obama's Interview With Jeffrey Goldberg on Syria and Foreign Policy - The Atl...

  • The president believes that Churchillian rhetoric and, more to the point, Churchillian habits of thought, helped bring his predecessor, George W. Bush, to ruinous war in Iraq.
  • Obama entered the White House bent on getting out of Iraq and Afghanistan; he was not seeking new dragons to slay. And he was particularly mindful of promising victory in conflicts he believed to be unwinnable. “If you were to say, for instance, that we’re going to rid Afghanistan of the Taliban and build a prosperous democracy instead, the president is aware that someone, seven years later, is going to hold you to that promise,” Ben Rhodes, Obama’s deputy national-security adviser, and his foreign-policy amanuensis, told me not long ago.
  • Power is a partisan of the doctrine known as “responsibility to protect,” which holds that sovereignty should not be considered inviolate when a country is slaughtering its own citizens. She lobbied him to endorse this doctrine in the speech he delivered when he accepted the Nobel Peace Prize in 2009, but he declined. Obama generally does not believe a president should place American soldiers at great risk in order to prevent humanitarian disasters, unless those disasters pose a direct security threat to the United States.
  • Obama’s resistance to direct intervention only grew. After several months of deliberation, he authorized the CIA to train and fund Syrian rebels, but he also shared the outlook of his former defense secretary, Robert Gates, who had routinely asked in meetings, “Shouldn’t we finish up the two wars we have before we look for another?”
  • In his first term, he came to believe that only a handful of threats in the Middle East conceivably warranted direct U.S. military intervention. These included the threat posed by al‑Qaeda; threats to the continued existence of Israel (“It would be a moral failing for me as president of the United States” not to defend Israel, he once told me); and, not unrelated to Israel’s security, the threat posed by a nuclear-armed Iran.
  • Bush and Scowcroft removed Saddam Hussein’s army from Kuwait in 1991, and they deftly managed the disintegration of the Soviet Union; Scowcroft also, on Bush’s behalf, toasted the leaders of China shortly after the slaughter in Tiananmen Square.
  • As Obama was writing his campaign manifesto, The Audacity of Hope, in 2006, Susan Rice, then an informal adviser, felt it necessary to remind him to include at least one line of praise for the foreign policy of President Bill Clinton, to partially balance the praise he showered on Bush and Scowcroft.
  • “When you have a professional army,” he once told me, “that is well armed and sponsored by two large states”—Iran and Russia—“who have huge stakes in this, and they are fighting against a farmer, a carpenter, an engineer who started out as protesters and suddenly now see themselves in the midst of a civil conflict …” He paused. “The notion that we could have—in a clean way that didn’t commit U.S. military forces—changed the equation on the ground there was never true.”
  • The message Obama telegraphed in speeches and interviews was clear: He would not end up like the second President Bush—a president who became tragically overextended in the Middle East, whose decisions filled the wards of Walter Reed with grievously wounded soldiers, who was helpless to stop the obliteration of his reputation, even when he recalibrated his policies in his second term. Obama would say privately that the first task of an American president in the post-Bush international arena was “Don’t do stupid shit.”
  • Hillary Clinton, when she was Obama’s secretary of state, argued for an early and assertive response to Assad’s violence. In 2014, after she left office, Clinton told me that “the failure to help build up a credible fighting force of the people who were the originators of the protests against Assad … left a big vacuum, which the jihadists have now filled.” When The Atlantic published this statement, and also published Clinton’s assessment that “great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle,” Obama became “rip-shit angry,” according to one of his senior advisers. The president did not understand how “Don’t do stupid shit” could be considered a controversial slogan.
  • The Iraq invasion, Obama believed, should have taught Democratic interventionists like Clinton, who had voted for its authorization, the dangers of doing stupid shit. (Clinton quickly apologized to Obama for her comments.)
  • Obama, unlike liberal interventionists, is an admirer of the foreign-policy realism of President George H. W. Bush and, in particular, of Bush’s national-security adviser, Brent Scowcroft (“I love that guy,” Obama once told me).
  • The danger to the United States posed by the Assad regime did not rise to the level of these challenges.
  • Obama generally believes that the Washington foreign-policy establishment, which he secretly disdains, makes a fetish of “credibility”—particularly the sort of credibility purchased with force. The preservation of credibility, he says, led to Vietnam. Within the White House, Obama would argue that “dropping bombs on someone to prove that you’re willing to drop bombs on someone is just about the worst reason to use force.”
  • American national-security credibility, as it is conventionally understood in the Pentagon, the State Department, and the cluster of think tanks headquartered within walking distance of the White House, is an intangible yet potent force—one that, when properly nurtured, keeps America’s friends feeling secure and keeps the international order stable.
  • All week, White House officials had publicly built the case that Assad had committed a crime against humanity. Kerry’s speech would mark the culmination of this campaign.
  • But the president had grown queasy. In the days after the gassing of Ghouta, Obama would later tell me, he found himself recoiling from the idea of an attack unsanctioned by international law or by Congress. The American people seemed unenthusiastic about a Syria intervention; so too did one of the few foreign leaders Obama respects, Angela Merkel, the German chancellor. She told him that her country would not participate in a Syria campaign. And in a stunning development, on Thursday, August 29, the British Parliament denied David Cameron its blessing for an attack. John Kerry later told me that when he heard that, “internally, I went, Oops.”
  • Obama was also unsettled by a surprise visit early in the week from James Clapper, his director of national intelligence, who interrupted the President’s Daily Brief, the threat report Obama receives each morning from Clapper’s analysts, to make clear that the intelligence on Syria’s use of sarin gas, while robust, was not a “slam dunk.” He chose the term carefully. Clapper, the chief of an intelligence community traumatized by its failures in the run-up to the Iraq War, was not going to overpromise, in the manner of the onetime CIA director George Tenet, who famously guaranteed George W. Bush a “slam dunk” in Iraq.
  • While the Pentagon and the White House’s national-security apparatuses were still moving toward war (John Kerry told me he was expecting a strike the day after his speech), the president had come to believe that he was walking into a trap—one laid both by allies and by adversaries, and by conventional expectations of what an American president is supposed to do.
  • Late on Friday afternoon, Obama determined that he was simply not prepared to authorize a strike. He asked McDonough, his chief of staff, to take a walk with him on the South Lawn of the White House. Obama did not choose McDonough randomly: He is the Obama aide most averse to U.S. military intervention, and someone who, in the words of one of his colleagues, “thinks in terms of traps.” Obama, ordinarily a preternaturally confident man, was looking for validation, and trying to devise ways to explain his change of heart, both to his own aides and to the public.
  • The third, and most important, factor, he told me, was “our assessment that while we could inflict some damage on Assad, we could not, through a missile strike, eliminate the chemical weapons themselves, and what I would then face was the prospect of Assad having survived the strike and claiming he had successfully defied the United States, that the United States had acted unlawfully in the absence of a UN mandate, and that that would have potentially strengthened his hand rather than weakened it.
  • Others had difficulty fathoming how the president could reverse himself the day before a planned strike. Obama, however, was completely calm. “If you’ve been around him, you know when he’s ambivalent about something, when it’s a 51–49 decision,” Ben Rhodes told me. “But he was completely at ease.”
  • Obama also shared with McDonough a long-standing resentment: He was tired of watching Washington unthinkingly drift toward war in Muslim countries. Four years earlier, the president believed, the Pentagon had “jammed” him on a troop surge for Afghanistan. Now, on Syria, he was beginning to feel jammed again.
  • The fourth factor, he said, was of deeper philosophical importance. “This falls in the category of something that I had been brooding on for some time,” he said. “I had come into office with the strong belief that the scope of executive power in national-security issues is very broad, but not limitless.”
  • Obama’s decision caused tremors across Washington as well. John McCain and Lindsey Graham, the two leading Republican hawks in the Senate, had met with Obama in the White House earlier in the week and had been promised an attack. They were angered by the about-face. Damage was done even inside the administration. Neither Chuck Hagel, then the secretary of defense, nor John Kerry was in the Oval Office when the president informed his team of his thinking. Kerry would not learn about the change until later that evening. “I just got fucked over,” he told a friend shortly after talking to the president that night. (When I asked Kerry recently about that tumultuous night, he said, “I didn’t stop to analyze it. I figured the president had a reason to make a decision and, honestly, I understood his notion.”)
  • The president asked Congress to authorize the use of force—the irrepressible Kerry served as chief lobbyist—and it quickly became apparent in the White House that Congress had little interest in a strike. When I spoke with Biden recently about the red-line decision, he made special note of this fact. “It matters to have Congress with you, in terms of your ability to sustain what you set out to do,” he said. Obama “didn’t go to Congress to get himself off the hook. He had his doubts at that point, but he knew that if he was going to do anything, he better damn well have the public with him, or it would be a very short ride.” Congress’s clear ambivalence convinced Biden that Obama was correct to fear the slippery slope. “What happens when we get a plane shot down? Do we not go in and rescue?,” Biden asked. “You need the support of the American people.”
  • At the G20 summit in St. Petersburg, which was held the week after the Syria reversal, Obama pulled Putin aside, he recalled to me, and told the Russian president “that if he forced Assad to get rid of the chemical weapons, that that would eliminate the need for us taking a military strike.” Within weeks, Kerry, working with his Russian counterpart, Sergey Lavrov, would engineer the removal of most of Syria’s chemical-weapons arsenal—a program whose existence Assad until then had refused to even acknowledge.
  • The arrangement won the president praise from, of all people, Benjamin Netanyahu, the Israeli prime minister, with whom he has had a consistently contentious relationship. The removal of Syria’s chemical-weapons stockpiles represented “the one ray of light in a very dark region,” Netanyahu told me not long after the deal was announced.
  • John Kerry today expresses no patience for those who argue, as he himself once did, that Obama should have bombed Assad-regime sites in order to buttress America’s deterrent capability. “You’d still have the weapons there, and you’d probably be fighting ISIL” for control of the weapons, he said, referring to the Islamic State, the terror group also known as ISIS. “It just doesn’t make sense. But I can’t deny to you that this notion about the red line being crossed and [Obama’s] not doing anything gained a life of its own.”
  • Today that decision is a source of deep satisfaction for him.
  • “I’m very proud of this moment,” he told me. “The overwhelming weight of conventional wisdom and the machinery of our national-security apparatus had gone fairly far. The perception was that my credibility was at stake, that America’s credibility was at stake. And so for me to press the pause button at that moment, I knew, would cost me politically. And the fact that I was able to pull back from the immediate pressures and think through in my own mind what was in America’s interest, not only with respect to Syria but also with respect to our democracy, was as tough a decision as I’ve made—and I believe ultimately it was the right decision to make.”
  • By 2013, Obama’s resentments were well developed. He resented military leaders who believed they could fix any problem if the commander in chief would simply give them what they wanted, and he resented the foreign-policy think-tank complex. A widely held sentiment inside the White House is that many of the most prominent foreign-policy think tanks in Washington are doing the bidding of their Arab and pro-Israel funders. I’ve heard one administration official refer to Massachusetts Avenue, the home of many of these think tanks, as “Arab-occupied territory.”
  • Over the past few months, I’ve spent several hours talking with him about the broadest themes of his “long game” foreign policy, including the themes he is most eager to discuss—namely, the ones that have nothing to do with the Middle East.
  • I have come to believe that, in Obama’s mind, August 30, 2013, was his liberation day, the day he defied not only the foreign-policy establishment and its cruise-missile playbook, but also the demands of America’s frustrating, high-maintenance allies in the Middle East—countries, he complains privately to friends and advisers, that seek to exploit American “muscle” for their own narrow and sectarian ends.
  • “Where am I controversial? When it comes to the use of military power,” he said. “That is the source of the controversy. There’s a playbook in Washington that presidents are supposed to follow. It’s a playbook that comes out of the foreign-policy establishment. And the playbook prescribes responses to different events, and these responses tend to be militarized responses. Where America is directly threatened, the playbook works. But the playbook can also be a trap that can lead to bad decisions. In the midst of an international challenge like Syria, you get judged harshly if you don’t follow the playbook, even if there are good reasons why it does not apply.”
  • For some foreign-policy experts, even within his own administration, Obama’s about-face on enforcing the red line was a dispiriting moment in which he displayed irresolution and naïveté, and did lasting damage to America’s standing in the world. “Once the commander in chief draws that red line,” Leon Panetta, who served as CIA director and then as secretary of defense in Obama’s first term, told me recently, “then I think the credibility of the commander in chief and this nation is at stake if he doesn’t enforce it.” Right after Obama’s reversal, Hillary Clinton said privately, “If you say you’re going to strike, you have to strike. There’s no choice.”
  • Obama’s defenders, however, argue that he did no damage to U.S. credibility, citing Assad’s subsequent agreement to have his chemical weapons removed. “The threat of force was credible enough for them to give up their chemical weapons,” Tim Kaine, a Democratic senator from Virginia, told me. “We threatened military action and they responded. That’s deterrent credibility.”
  • History may record August 30, 2013, as the day Obama prevented the U.S. from entering yet another disastrous Muslim civil war, and the day he removed the threat of a chemical attack on Israel, Turkey, or Jordan. Or it could be remembered as the day he let the Middle East slip from America’s grasp, into the hands of Russia, Iran, and ISIS.
  • I spoke with Obama about foreign policy when he was a U.S. senator, in 2006. At the time, I was familiar mainly with the text of a speech he had delivered four years earlier, at a Chicago antiwar rally. It was an unusual speech for an antiwar rally in that it was not antiwar; Obama, who was then an Illinois state senator, argued only against one specific and, at the time, still theoretical, war. “I suffer no illusions about Saddam Hussein,” he said. “He is a brutal man. A ruthless man … But I also know that Saddam poses no imminent and direct threat to the United States or to his neighbors.” He added, “I know that an invasion of Iraq without a clear rationale and without strong international support will only fan the flames of the Middle East, and encourage the worst, rather than best, impulses of the Arab world, and strengthen the recruitment arm of al-Qaeda.”
  • This speech had made me curious about its author. I wanted to learn how an Illinois state senator, a part-time law professor who spent his days traveling between Chicago and Springfield, had come to a more prescient understanding of the coming quagmire than the most experienced foreign-policy thinkers of his party, including such figures as Hillary Clinton, Joe Biden, and John Kerry, not to mention, of course, most Republicans and many foreign-policy analysts and writers, including me.
  • This was the moment the president believes he finally broke with what he calls, derisively, the “Washington playbook.”
  • “ISIS is not an existential threat to the United States,” he told me in one of these conversations. “Climate change is a potential existential threat to the entire world if we don’t do something about it.” Obama explained that climate change worries him in particular because “it is a political problem perfectly designed to repel government intervention. It involves every single country, and it is a comparatively slow-moving emergency, so there is always something seemingly more urgent on the agenda.”
  • At the moment, of course, the most urgent of the “seemingly more urgent” issues is Syria. But at any given moment, Obama’s entire presidency could be upended by North Korean aggression, or an assault by Russia on a member of NATO, or an ISIS-planned attack on U.S. soil. Few presidents have faced such diverse tests on the international stage as Obama has, and the challenge for him, as for all presidents, has been to distinguish the merely urgent from the truly important, and to focus on the important.
  • My goal in our recent conversations was to see the world through Obama’s eyes, and to understand what he believes America’s role in the world should be. This article is informed by our recent series of conversations, which took place in the Oval Office; over lunch in his dining room; aboard Air Force One; and in Kuala Lumpur during his most recent visit to Asia, in November. It is also informed by my previous interviews with him and by his speeches and prolific public ruminations, as well as by conversations with his top foreign-policy and national-security advisers, foreign leaders and their ambassadors in Washington, friends of the president and others who have spoken with him about his policies and decisions, and his adversaries and critics.
  • Over the course of our conversations, I came to see Obama as a president who has grown steadily more fatalistic about the constraints on America’s ability to direct global events, even as he has, late in his presidency, accumulated a set of potentially historic foreign-policy achievements—controversial, provisional achievements, to be sure, but achievements nonetheless: the opening to Cuba, the Paris climate-change accord, the Trans-Pacific Partnership trade agreement, and, of course, the Iran nuclear deal.
  • These he accomplished despite his growing sense that larger forces—the riptide of tribal feeling in a world that should have already shed its atavism; the resilience of small men who rule large countries in ways contrary to their own best interests; the persistence of fear as a governing human emotion—frequently conspire against the best of America’s intentions. But he also has come to learn, he told me, that very little is accomplished in international affairs without U.S. leadership.
  • Obama talked me through this apparent contradiction. “I want a president who has the sense that you can’t fix everything,” he said. But on the other hand, “if we don’t set the agenda, it doesn’t happen.” He explained what he meant. “The fact is, there is not a summit I’ve attended since I’ve been president where we are not setting the agenda, where we are not responsible for the key results,” he said. “That’s true whether you’re talking about nuclear security, whether you’re talking about saving the world financial system, whether you’re talking about climate.”
  • One day, over lunch in the Oval Office dining room, I asked the president how he thought his foreign policy might be understood by historians. He started by describing for me a four-box grid representing the main schools of American foreign-policy thought. One box he called isolationism, which he dismissed out of hand. “The world is ever-shrinking,” he said. “Withdrawal is untenable.” The other boxes he labeled realism, liberal interventionism, and internationalism. “I suppose you could call me a realist in believing we can’t, at any given moment, relieve all the world’s misery,” he said. “We have to choose where we can make a real impact.” He also noted that he was quite obviously an internationalist, devoted as he is to strengthening multilateral organizations and international norms.
  • If a crisis, or a humanitarian catastrophe, does not meet his stringent standard for what constitutes a direct national-security threat, Obama said, he doesn’t believe that he should be forced into silence. He is not so much the realist, he suggested, that he won’t pass judgment on other leaders.
  • Though he has so far ruled out the use of direct American power to depose Assad, he was not wrong, he argued, to call on Assad to go. “Oftentimes when you get critics of our Syria policy, one of the things that they’ll point out is ‘You called for Assad to go, but you didn’t force him to go. You did not invade.’ And the notion is that if you weren’t going to overthrow the regime, you shouldn’t have said anything. That’s a weird argument to me, the notion that if we use our moral authority to say ‘This is a brutal regime, and this is not how a leader should treat his people,’ once you do that, you are obliged to invade the country and install a government you prefer.”
  • “I am very much the internationalist,” Obama said in a later conversation. “And I am also an idealist insofar as I believe that we should be promoting values, like democracy and human rights and norms and values
  • “Having said that,” he continued, “I also believe that the world is a tough, complicated, messy, mean place, and full of hardship and tragedy. And in order to advance both our security interests and those ideals and values that we care about, we’ve got to be hardheaded at the same time as we’re bighearted, and pick and choose our spots, and recognize that there are going to be times where the best that we can do is to shine a spotlight on something that’s terrible, but not believe that we can automatically solve it. There are going to be times where our security interests conflict with our concerns about human rights. There are going to be times where we can do something about innocent people being killed, but there are going to be times where we can’t.”
  • If Obama ever questioned whether America really is the world’s one indispensable nation, he no longer does so. But he is the rare president who seems at times to resent indispensability, rather than embrace it.
  • “Free riders aggravate me,” he told me. Recently, Obama warned that Great Britain would no longer be able to claim a “special relationship” with the United States if it did not commit to spending at least 2 percent of its GDP on defense. “You have to pay your fair share,” Obama told David Cameron, who subsequently met the 2 percent threshold.
  • Part of his mission as president, Obama explained, is to spur other countries to take action for themselves, rather than wait for the U.S. to lead. The defense of the liberal international order against jihadist terror, Russian adventurism, and Chinese bullying depends in part, he believes, on the willingness of other nations to share the burden with the U.S.
  • This is why the controversy surrounding the assertion—made by an anonymous administration official to The New Yorker during the Libya crisis of 2011—that his policy consisted of “leading from behind” perturbed him. “We don’t have to always be the ones who are up front,” he told me. “Sometimes we’re going to get what we want precisely because we are sharing in the agenda.
  • The president also seems to believe that sharing leadership with other countries is a way to check America’s more unruly impulses. “One of the reasons I am so focused on taking action multilaterally where our direct interests are not at stake is that multilateralism regulates hubris.”
  • He consistently invokes what he understands to be America’s past failures overseas as a means of checking American self-righteousness. “We have history,” he said. “We have history in Iran, we have history in Indonesia and Central America. So we have to be mindful of our history when we start talking about intervening, and understand the source of other people’s suspicions.”
  • In his efforts to off-load some of America’s foreign-policy responsibilities to its allies, Obama appears to be a classic retrenchment president in the manner of Dwight D. Eisenhower and Richard Nixon. Retrenchment, in this context, is defined as “pulling back, spending less, cutting risk, and shifting burdens to allies.”
  • One difference between Eisenhower and Nixon, on the one hand, and Obama, on the other, Sestanovich said, is that Obama “appears to have had a personal, ideological commitment to the idea that foreign policy had consumed too much of the nation’s attention and resources.”
  • But once he decides that a particular challenge represents a direct national-security threat, he has shown a willingness to act unilaterally. This is one of the larger ironies of the Obama presidency: He has relentlessly questioned the efficacy of force, but he has also become the most successful terrorist-hunter in the history of the presidency, one who will hand to his successor a set of tools an accomplished assassin would envy.
  • “He applies different standards to direct threats to the U.S.,” Ben Rhodes says. “For instance, despite his misgivings about Syria, he has not had a second thought about drones.” Some critics argue he should have had a few second thoughts about what they see as the overuse of drones. But John Brennan, Obama’s CIA director, told me recently that he and the president “have similar views. One of them is that sometimes you have to take a life to save even more lives. We have a similar view of just-war theory. The president requires near-certainty of no collateral damage. But if he believes it is necessary to act, he doesn’t hesitate.”
  • Those who speak with Obama about jihadist thought say that he possesses a no-illusions understanding of the forces that drive apocalyptic violence among radical Muslims, but he has been careful about articulating that publicly, out of concern that he will exacerbate anti-Muslim xenophobia.
  • He has a tragic realist’s understanding of sin, cowardice, and corruption, and a Hobbesian appreciation of how fear shapes human behavior. And yet he consistently, and with apparent sincerity, professes optimism that the world is bending toward justice. He is, in a way, a Hobbesian optimist.
  • The contradictions do not end there. Though he has a reputation for prudence, he has also been eager to question some of the long-standing assumptions undergirding traditional U.S. foreign-policy thinking. To a remarkable degree, he is willing to question why America’s enemies are its enemies, or why some of its friends are its friends.
  • It is assumed, at least among his critics, that Obama sought the Iran deal because he has a vision of a historic American-Persian rapprochement. But his desire for the nuclear agreement was born of pessimism as much as it was of optimism. “The Iran deal was never primarily about trying to open a new era of relations between the U.S. and Iran,” Susan Rice told me. “It was far more pragmatic and minimalist. The aim was very simply to make a dangerous country substantially less dangerous. No one had any expectation that Iran would be a more benign actor.”
  • I once mentioned to Obama a scene from The Godfather: Part III, in which Michael Corleone complains angrily about his failure to escape the grasp of organized crime. I told Obama that the Middle East is to his presidency what the Mob is to Corleone, and I started to quote the Al Pacino line: “Just when I thought I was out—” “It pulls you back in,” Obama said, completing the thought.
  • When I asked Obama recently what he had hoped to accomplish with his Cairo reset speech, he said that he had been trying—unsuccessfully, he acknowledged—to persuade Muslims to more closely examine the roots of their unhappiness.“My argument was this: Let’s all stop pretending that the cause of the Middle East’s problems is Israel,” he told me. “We want to work to help achieve statehood and dignity for the Palestinians, but I was hoping that my speech could trigger a discussion, could create space for Muslims to address the real problems they are confronting—problems of governance, and the fact that some currents of Islam have not gone through a reformation that would help people adapt their religious doctrines to modernity. My thought was, I would communicate that the U.S. is not standing in the way of this progress, that we would help, in whatever way possible, to advance the goals of a practical, successful Arab agenda that provided a better life for ordinary people.”
  • But over the next three years, as the Arab Spring gave up its early promise, and brutality and dysfunction overwhelmed the Middle East, the president grew disillusioned. Some of his deepest disappointments concern Middle Eastern leaders themselves. Benjamin Netanyahu is in his own category: Obama has long believed that Netanyahu could bring about a two-state solution that would protect Israel’s status as a Jewish-majority democracy, but is too fearful and politically paralyzed to do so
  • Obama has also not had much patience for Netanyahu and other Middle Eastern leaders who question his understanding of the region. In one of Netanyahu’s meetings with the president, the Israeli prime minister launched into something of a lecture about the dangers of the brutal region in which he lives, and Obama felt that Netanyahu was behaving in a condescending fashion, and was also avoiding the subject at hand: peace negotiations. Finally, the president interrupted the prime minister: “Bibi, you have to understand something,” he said. “I’m the African American son of a single mother, and I live here, in this house. I live in the White House. I managed to get elected president of the United States. You think I don’t understand what you’re talking about, but I do.”
  • Other leaders also frustrate him immensely. Early on, Obama saw Recep Tayyip Erdoğan, the president of Turkey, as the sort of moderate Muslim leader who would bridge the divide between East and West—but Obama now considers him a failure and an authoritarian, one who refuses to use his enormous army to bring stability to Syria.
  • In recent days, the president has taken to joking privately, “All I need in the Middle East is a few smart autocrats.” Obama has always had a fondness for pragmatic, emotionally contained technocrats, telling aides, “If only everyone could be like the Scandinavians, this would all be easy.”
  • The unraveling of the Arab Spring darkened the president’s view of what the U.S. could achieve in the Middle East, and made him realize how much the chaos there was distracting from other priorities. “The president recognized during the course of the Arab Spring that the Middle East was consuming us.”
  • But what sealed Obama’s fatalistic view was the failure of his administration’s intervention in Libya, in 2011.
  • Obama says today of the intervention, “It didn’t work.” The U.S., he believes, planned the Libya operation carefully—and yet the country is still a disaster.
  • “So we actually executed this plan as well as I could have expected: We got a UN mandate, we built a coalition, it cost us $1 billion—which, when it comes to military operations, is very cheap. We averted large-scale civilian casualties, we prevented what almost surely would have been a prolonged and bloody civil conflict. And despite all that, Libya is a mess.”
  • Mess is the president’s diplomatic term; privately, he calls Libya a “shit show,” in part because it’s subsequently become an ISIS haven—one that he has already targeted with air strikes. It became a shit show, Obama believes, for reasons that had less to do with American incompetence than with the passivity of America’s allies and with the obdurate power of tribalism.
  • Of France, he said, “Sarkozy wanted to trumpet the flights he was taking in the air campaign, despite the fact that we had wiped out all the air defenses and essentially set up the entire infrastructure” for the intervention. This sort of bragging was fine, Obama said, because it allowed the U.S. to “purchase France’s involvement in a way that made it less expensive for us and less risky for us.” In other words, giving France extra credit in exchange for less risk and cost to the United States was a useful trade-off—except that “from the perspective of a lot of the folks in the foreign-policy establishment, well, that was terrible. If we’re going to do something, obviously we’ve got to be up front, and nobody else is sharing in the spotlight.”
  • Obama also blamed internal Libyan dynamics. “The degree of tribal division in Libya was greater than our analysts had expected. And our ability to have any kind of structure there that we could interact with and start training and start providing resources broke down very quickly.”
  • Libya proved to him that the Middle East was best avoided. “There is no way we should commit to governing the Middle East and North Africa,” he recently told a former colleague from the Senate. “That would be a basic, fundamental mistake.”
  • Obama did not come into office preoccupied by the Middle East. He is the first child of the Pacific to become president—born in Hawaii, raised there and, for four years, in Indonesia—and he is fixated on turning America’s attention to Asia
  • For Obama, Asia represents the future. Africa and Latin America, in his view, deserve far more U.S. attention than they receive. Europe, about which he is unromantic, is a source of global stability that requires, to his occasional annoyance, American hand-holding. And the Middle East is a region to be avoided—one that, thanks to America’s energy revolution, will soon be of negligible relevance to the U.S. economy.
  • Advisers recall that Obama would cite a pivotal moment in The Dark Knight, the 2008 Batman movie, to help explain not only how he understood the role of ISIS, but how he understood the larger ecosystem in which it grew. “There’s a scene in the beginning in which the gang leaders of Gotham are meeting,” the president would say. “These are men who had the city divided up. They were thugs, but there was a kind of order. Everyone had his turf. And then the Joker comes in and lights the whole city on fire. ISIL is the Joker. It has the capacity to set the whole region on fire. That’s why we have to fight it.”
  • The rise of the Islamic State deepened Obama’s conviction that the Middle East could not be fixed—not on his watch, and not for a generation to come.
  • The traveling White House press corps was unrelenting: “Isn’t it time for your strategy to change?” one reporter asked. This was followed by “Could I ask you to address your critics who say that your reluctance to enter another Middle East war, and your preference of diplomacy over using the military, makes the United States weaker and emboldens our enemies?” And then came this imperishable question, from a CNN reporter: “If you’ll forgive the language—why can’t we take out these bastards?” Which was followed by “Do you think you really understand this enemy well enough to defeat them and to protect the homeland?”
  • This rhetoric appeared to frustrate Obama immensely. “When I hear folks say that, well, maybe we should just admit the Christians but not the Muslims; when I hear political leaders suggesting that there would be a religious test for which person who’s fleeing from a war-torn country is admitted,” Obama told the assembled reporters, “that’s not American. That’s not who we are. We don’t have religious tests to our compassion.”
  • he has never believed that terrorism poses a threat to America commensurate with the fear it generates. Even during the period in 2014 when ISIS was executing its American captives in Syria, his emotions were in check. Valerie Jarrett, Obama’s closest adviser, told him people were worried that the group would soon take its beheading campaign to the U.S. “They’re not coming here to chop our heads off,” he reassured her.
  • Obama frequently reminds his staff that terrorism takes far fewer lives in America than handguns, car accidents, and falls in bathtubs do
  • Several years ago, he expressed to me his admiration for Israelis’ “resilience” in the face of constant terrorism, and it is clear that he would like to see resilience replace panic in American society. Nevertheless, his advisers are fighting a constant rearguard action to keep Obama from placing terrorism in what he considers its “proper” perspective, out of concern that he will seem insensitive to the fears of the American people.
  • When I noted to Kerry that the president’s rhetoric doesn’t match his, he said, “President Obama sees all of this, but he doesn’t gin it up into this kind of—he thinks we are on track. He has escalated his efforts. But he’s not trying to create hysteria … I think the president is always inclined to try to keep things on an appropriate equilibrium. I respect that.”
  • Obama modulates his discussion of terrorism for several reasons: He is, by nature, Spockian. And he believes that a misplaced word, or a frightened look, or an ill-considered hyperbolic claim, could tip the country into panic. The sort of panic he worries about most is the type that would manifest itself in anti-Muslim xenophobia or in a challenge to American openness and to the constitutional order.
  • The president also gets frustrated that terrorism keeps swamping his larger agenda, particularly as it relates to rebalancing America’s global priorities. For years, the “pivot to Asia” has been a paramount priority of his. America’s economic future lies in Asia, he believes, and the challenge posed by China’s rise requires constant attention. From his earliest days in office, Obama has been focused on rebuilding the sometimes-threadbare ties between the U.S. and its Asian treaty partners, and he is perpetually on the hunt for opportunities to draw other Asian nations into the U.S. orbit. His dramatic opening to Burma was one such opportunity; Vietnam and the entire constellation of Southeast Asian countries fearful of Chinese domination presented others.
  • Obama believes, Carter said, that Asia “is the part of the world of greatest consequence to the American future, and that no president can take his eye off of this.” He added, “He consistently asks, even in the midst of everything else that’s going on, ‘Where are we in the Asia-Pacific rebalance? Where are we in terms of resources?’ He’s been extremely consistent about that, even in times of Middle East tension.”
  • “Right now, I don’t think that anybody can be feeling good about the situation in the Middle East,” he said. “You have countries that are failing to provide prosperity and opportunity for their people. You’ve got a violent, extremist ideology, or ideologies, that are turbocharged through social media. You’ve got countries that have very few civic traditions, so that as autocratic regimes start fraying, the only organizing principles are sectarian.”
  • He went on, “Contrast that with Southeast Asia, which still has huge problems—enormous poverty, corruption—but is filled with striving, ambitious, energetic people who are every single day scratching and clawing to build businesses and get education and find jobs and build infrastructure. The contrast is pretty stark.”
  • In Asia, as well as in Latin America and Africa, Obama says, he sees young people yearning for self-improvement, modernity, education, and material wealth. “They are not thinking about how to kill Americans,” he says. “What they’re thinking about is How do I get a better education? How do I create something of value?”
  • He then made an observation that I came to realize was representative of his bleakest, most visceral understanding of the Middle East today—not the sort of understanding that a White House still oriented around themes of hope and change might choose to advertise. “If we’re not talking to them,” he said, referring to young Asians and Africans and Latin Americans, “because the only thing we’re doing is figuring out how to destroy or cordon off or control the malicious, nihilistic, violent parts of humanity, then we’re missing the boat.
  • He does resist refracting radical Islam through the “clash of civilizations” prism popularized by the late political scientist Samuel Huntington. But this is because, he and his advisers argue, he does not want to enlarge the ranks of the enemy. “The goal is not to force a Huntington template onto this conflict,” said John Brennan, the CIA director.
  • “It is very clear what I mean,” he told me, “which is that there is a violent, radical, fanatical, nihilistic interpretation of Islam by a faction—a tiny faction—within the Muslim community that is our enemy, and that has to be defeated.”
  • “There is also the need for Islam as a whole to challenge that interpretation of Islam, to isolate it, and to undergo a vigorous discussion within their community about how Islam works as part of a peaceful, modern society,” he said. But he added, “I do not persuade peaceful, tolerant Muslims to engage in that debate if I’m not sensitive to their concern that they are being tagged with a broad brush.”
  • In private encounters with other world leaders, Obama has argued that there will be no comprehensive solution to Islamist terrorism until Islam reconciles itself to modernity and undergoes some of the reforms that have changed Christianity.
  • Obama described how he has watched Indonesia gradually move from a relaxed, syncretistic Islam to a more fundamentalist, unforgiving interpretation; large numbers of Indonesian women, he observed, have now adopted the hijab, the Muslim head covering.
  • Why, Turnbull asked, was this happening? Because, Obama answered, the Saudis and other Gulf Arabs have funneled money, and large numbers of imams and teachers, into the country. In the 1990s, the Saudis heavily funded Wahhabist madrassas, seminaries that teach the fundamentalist version of Islam favored by the Saudi ruling family, Obama told Turnbull. Today, Islam in Indonesia is much more Arab in orientation than it was when he lived there, he said.
  • “Aren’t the Saudis your friends?,” Turnbull asked. Obama smiled. “It’s complicated,” he said.
  • But he went on to say that the Saudis need to “share” the Middle East with their Iranian foes. “The competition between the Saudis and the Iranians—which has helped to feed proxy wars and chaos in Syria and Iraq and Yemen—requires us to say to our friends as well as to the Iranians that they need to find an effective way to share the neighborhood and institute some sort of cold peace,”
  • “An approach that said to our friends ‘You are right, Iran is the source of all problems, and we will support you in dealing with Iran’ would essentially mean that as these sectarian conflicts continue to rage and our Gulf partners, our traditional friends, do not have the ability to put out the flames on their own or decisively win on their own, and would mean that we have to start coming in and using our military power to settle scores. And that would be in the interest neither of the United States nor of the Middle East.”
  • One of the most destructive forces in the Middle East, Obama believes, is tribalism—a force no president can neutralize. Tribalism, made manifest in the reversion to sect, creed, clan, and village by the desperate citizens of failing states, is the source of much of the Muslim Middle East’s problems, and it is another source of his fatalism. Obama has deep respect for the destructive resilience of tribalism—part of his memoir, Dreams From My Father, concerns the way in which tribalism in post-colonial Kenya helped ruin his father’s life—which goes some distance in explaining why he is so fastidious about avoiding entanglements in tribal conflicts.
  • “It is literally in my DNA to be suspicious of tribalism,” he told me. “I understand the tribal impulse, and acknowledge the power of tribal division. I’ve been navigating tribal divisions my whole life. In the end, it’s the source of a lot of destructive acts.”
  • “Look, I am not of the view that human beings are inherently evil,” he said. “I believe that there’s more good than bad in humanity. And if you look at the trajectory of history, I am optimistic.
  • “I believe that overall, humanity has become less violent, more tolerant, healthier, better fed, more empathetic, more able to manage difference. But it’s hugely uneven. And what has been clear throughout the 20th and 21st centuries is that the progress we make in social order and taming our baser impulses and steadying our fears can be reversed very quickly. Social order starts breaking down if people are under profound stress. Then the default position is tribe—us/them, a hostility toward the unfamiliar or the unknown.”
  • He continued, “Right now, across the globe, you’re seeing places that are undergoing severe stress because of globalization, because of the collision of cultures brought about by the Internet and social media, because of scarcities—some of which will be attributable to climate change over the next several decades—because of population growth. And in those places, the Middle East being Exhibit A, the default position for a lot of folks is to organize tightly in the tribe and to push back or strike out against those who are different.
  • “A group like ISIL is the distillation of every worst impulse along these lines. The notion that we are a small group that defines ourselves primarily by the degree to which we can kill others who are not like us, and attempting to impose a rigid orthodoxy that produces nothing, that celebrates nothing, that really is contrary to every bit of human progress—it indicates the degree to which that kind of mentality can still take root and gain adherents in the 21st century.”
  • “We have to determine the best tools to roll back those kinds of attitudes,” he said. “There are going to be times where either because it’s not a direct threat to us or because we just don’t have the tools in our toolkit to have a huge impact that, tragically, we have to refrain from jumping in with both feet.”
  • I asked Obama whether he would have sent the Marines to Rwanda in 1994 to stop the genocide as it was happening, had he been president at the time. “Given the speed with which the killing took place, and how long it takes to crank up the machinery of the U.S. government, I understand why we did not act fast enough,” he said. “Now, we should learn from that.
  • I actually think that Rwanda is an interesting test case because it’s possible—not guaranteed, but it’s possible—that this was a situation where the quick application of force might have been enough.
  • “Ironically, it’s probably easier to make an argument that a relatively small force inserted quickly with international support would have resulted in averting genocide [more successfully in Rwanda] than in Syria right now, where the degree to which the various groups are armed and hardened fighters and are supported by a whole host of external actors with a lot of resources requires a much larger commitment of forces.”
  • The Turkey press conference, I told him, “was a moment for you as a politician to say, ‘Yeah, I hate the bastards too, and by the way, I am taking out the bastards.’ ” The easy thing to do would have been to reassure Americans in visceral terms that he will kill the people who want to kill them. Does he fear a knee-jerk reaction in the direction of another Middle East invasion? Or is he just inalterably Spockian?
  • “Every president has strengths and weaknesses,” he answered. “And there is no doubt that there are times where I have not been attentive enough to feelings and emotions and politics in communicating what we’re doing and how we’re doing it.”
  • But for America to be successful in leading the world, he continued, “I believe that we have to avoid being simplistic. I think we have to build resilience and make sure that our political debates are grounded in reality. It’s not that I don’t appreciate the value of theater in political communications; it’s that the habits we—the media, politicians—have gotten into, and how we talk about these issues, are so detached so often from what we need to be doing that for me to satisfy the cable news hype-fest would lead to us making worse and worse decisions over time.”
  • “During the couple of months in which everybody was sure Ebola was going to destroy the Earth and there was 24/7 coverage of Ebola, if I had fed the panic or in any way strayed from ‘Here are the facts, here’s what needs to be done, here’s how we’re handling it, the likelihood of you getting Ebola is very slim, and here’s what we need to do both domestically and overseas to stamp out this epidemic,’ ” then “maybe people would have said ‘Obama is taking this as seriously as he needs to be.’ ” But feeding the panic by overreacting could have shut down travel to and from three African countries that were already cripplingly poor, in ways that might have destroyed their economies—which would likely have meant, among other things, a recurrence of Ebola. He added, “It would have also meant that we might have wasted a huge amount of resources in our public-health systems that need to be devoted to flu vaccinations and other things that actually kill people” in large numbers in America
  • “I have friends who have kids in Paris right now,” he said. “And you and I and a whole bunch of people who are writing about what happened in Paris have strolled along the same streets where people were gunned down. And it’s right to feel fearful. And it’s important for us not to ever get complacent. There’s a difference between resilience and complacency.” He went on to describe another difference—between making considered decisions and making rash, emotional ones. “What it means, actually, is that you care so much that you want to get it right and you’re not going to indulge in either impetuous or, in some cases, manufactured responses that make good sound bites but don’t produce results. The stakes are too high to play those games.”
  • The other meeting took place two months later, in the Oval Office, between Obama and the general secretary of the Vietnamese Communist Party, Nguyen Phu Trong. This meeting took place only because John Kerry had pushed the White House to violate protocol, since the general secretary was not a head of state. But the goals trumped decorum: Obama wanted to lobby the Vietnamese on the Trans-Pacific Partnership—his negotiators soon extracted a promise from the Vietnamese that they would legalize independent labor unions—and he wanted to deepen cooperation on strategic issues. Administration officials have repeatedly hinted to me that Vietnam may one day soon host a permanent U.S. military presence, to check the ambitions of the country it now fears most, China. The U.S. Navy’s return to Cam Ranh Bay would count as one of the more improbable developments in recent American history. “We just moved the Vietnamese Communist Party to recognize labor rights in a way that we could never do by bullying them or scaring them,” Obama told me, calling this a key victory in his campaign to replace stick-waving with diplomatic persuasion.
  • I noted that the 200 or so young Southeast Asians in the room earlier that day—including citizens of Communist-ruled countries—seemed to love America. “They do,” Obama said. “In Vietnam right now, America polls at 80 percent.”
  • The resurgent popularity of America throughout Southeast Asia means that “we can do really big, important stuff—which, by the way, then has ramifications across the board,” he said, “because when Malaysia joins the anti-ISIL campaign, that helps us leverage resources and credibility in our fight against terrorism. When we have strong relations with Indonesia, that helps us when we are going to Paris and trying to negotiate a climate treaty, where the temptation of a Russia or some of these other countries may be to skew the deal in a way that is unhelpful.
  • Obama then cited America’s increased influence in Latin America—increased, he said, in part by his removal of a region-wide stumbling block when he reestablished ties with Cuba—as proof that his deliberate, nonthreatening, diplomacy-centered approach to foreign relations is working. The ALBA movement, a group of Latin American governments oriented around anti-Americanism, has significantly weakened during his time as president. “When I came into office, at the first Summit of the Americas that I attended, Hugo Chávez”—the late anti-American Venezuelan dictator—“was still the dominant figure in the conversation,” he said. “We made a very strategic decision early on, which was, rather than blow him up as this 10-foot giant adversary, to right-size the problem and say, ‘We don’t like what’s going on in Venezuela, but it’s not a threat to the United States.’
  • Obama said that to achieve this rebalancing, the U.S. had to absorb the diatribes and insults of superannuated Castro manqués. “When I saw Chávez, I shook his hand and he handed me a Marxist critique of the U.S.–Latin America relationship,” Obama recalled. “And I had to sit there and listen to Ortega”—Daniel Ortega, the radical leftist president of Nicaragua—“make an hour-long rant against the United States. But us being there, not taking all that stuff seriously—because it really wasn’t a threat to us”—helped neutralize the region’s anti-Americanism.
  • “The truth is, actually, Putin, in all of our meetings, is scrupulously polite, very frank. Our meetings are very businesslike. He never keeps me waiting two hours like he does a bunch of these other folks.” Obama said that Putin believes his relationship with the U.S. is more important than Americans tend to think. “He’s constantly interested in being seen as our peer and as working with us, because he’s not completely stupid. He understands that Russia’s overall position in the world is significantly diminished. And the fact that he invades Crimea or is trying to prop up Assad doesn’t suddenly make him a player.
  • “The argument is made,” I said, “that Vladimir Putin watched you in Syria and thought, He’s too logical, he’s too rational, he’s too into retrenchment. I’m going to push him a little bit further in Ukraine.”
  • “Look, this theory is so easily disposed of that I’m always puzzled by how people make the argument. I don’t think anybody thought that George W. Bush was overly rational or cautious in his use of military force. And as I recall, because apparently nobody in this town does, Putin went into Georgia on Bush’s watch, right smack dab in the middle of us having over 100,000 troops deployed in Iraq.” Obama was referring to Putin’s 2008 invasion of Georgia, a former Soviet republic, which was undertaken for many of the same reasons Putin later invaded Ukraine—to keep an ex–Soviet republic in Russia’s sphere of influence.
  • “Putin acted in Ukraine in response to a client state that was about to slip out of his grasp. And he improvised in a way to hang on to his control there,” he said. “He’s done the exact same thing in Syria, at enormous cost to the well-being of his own country. And the notion that somehow Russia is in a stronger position now, in Syria or in Ukraine, than they were before they invaded Ukraine or before he had to deploy military forces to Syria is to fundamentally misunderstand the nature of power in foreign affairs or in the world generally. Real power means you can get what you want without having to exert violence. Russia was much more powerful when Ukraine looked like an independent country but was a kleptocracy that he could pull the strings on.”
  • Obama’s theory here is simple: Ukraine is a core Russian interest but not an American one, so Russia will always be able to maintain escalatory dominance there. “The fact is that Ukraine, which is a non-NATO country, is going to be vulnerable to military domination by Russia no matter what we do,” he said.
  • “I think that the best argument you can make on the side of those who are critics of my foreign policy is that the president doesn’t exploit ambiguity enough. He doesn’t maybe react in ways that might cause people to think, Wow, this guy might be a little crazy.” “The ‘crazy Nixon’ approach,” I said: Confuse and frighten your enemies by making them think you’re capable of committing irrational acts.
  • “But let’s examine the Nixon theory,” he said. “So we dropped more ordnance on Cambodia and Laos than on Europe in World War II, and yet, ultimately, Nixon withdrew, Kissinger went to Paris, and all we left behind was chaos, slaughter, and authoritarian governments
  • “There is no evidence in modern American foreign policy that that’s how people respond. People respond based on what their imperatives are, and if it’s really important to somebody, and it’s not that important to us, they know that, and we know that,” he said. “There are ways to deter, but it requires you to be very clear ahead of time about what is worth going to war for and what is not.
  • Now, if there is somebody in this town that would claim that we would consider going to war with Russia over Crimea and eastern Ukraine, they should speak up and be very clear about it. The idea that talking tough or engaging in some military action that is tangential to that particular area is somehow going to influence the decision making of Russia or China is contrary to all the evidence we have seen over the last 50 years.”
  • “If you think about, let’s say, the Iran hostage crisis, there is a narrative that has been promoted today by some of the Republican candidates that the day Reagan was elected, because he looked tough, the Iranians decided, ‘We better turn over these hostages,’ ” he said. “In fact what had happened was that there was a long negotiation with the Iranians and because they so disliked Carter—even though the negotiations had been completed—they held those hostages until the day Reagan got elected
  • When you think of the military actions that Reagan took, you have Grenada—which is hard to argue helped our ability to shape world events, although it was good politics for him back home. You have the Iran-Contra affair, in which we supported right-wing paramilitaries and did nothing to enhance our image in Central America, and it wasn’t successful at all.” He reminded me that Reagan’s great foe, Daniel Ortega, is today the unrepentant president of Nicaragua.
  • Obama also cited Reagan’s decision to almost immediately pull U.S. forces from Lebanon after 241 servicemen were killed in a Hezbollah attack in 1983. “Apparently all these things really helped us gain credibility with the Russians and the Chinese,” because “that’s the narrative that is told,” he said sarcastically.
  • “Now, I actually think that Ronald Reagan had a great success in foreign policy, which was to recognize the opportunity that Gorbachev presented and to engage in extensive diplomacy—which was roundly criticized by some of the same people who now use Ronald Reagan to promote the notion that we should go around bombing people.”
  • “As I survey the next 20 years, climate change worries me profoundly because of the effects that it has on all the other problems that we face,” he said. “If you start seeing more severe drought; more significant famine; more displacement from the Indian subcontinent and coastal regions in Africa and Asia; the continuing problems of scarcity, refugees, poverty, disease—this makes every other problem we’ve got worse. That’s above and beyond just the existential issues of a planet that starts getting into a bad feedback loop.”
  • Terrorism, he said, is also a long-term problem “when combined with the problem of failed states.”
  • What country does he consider the greatest challenge to America in the coming decades? “In terms of traditional great-state relations, I do believe that the relationship between the United States and China is going to be the most critical,” he said. “If we get that right and China continues on a peaceful rise, then we have a partner that is growing in capability and sharing with us the burdens and responsibilities of maintaining an international order. If China fails; if it is not able to maintain a trajectory that satisfies its population and has to resort to nationalism as an organizing principle; if it feels so overwhelmed that it never takes on the responsibilities of a country its size in maintaining the international order; if it views the world only in terms of regional spheres of influence—then not only do we see the potential for conflict with China, but we will find ourselves having more difficulty dealing with these other challenges that are going to come.”
  • “I’ve been very explicit in saying that we have more to fear from a weakened, threatened China than a successful, rising China,” Obama said. “I think we have to be firm where China’s actions are undermining international interests, and if you look at how we’ve operated in the South China Sea, we have been able to mobilize most of Asia to isolate China in ways that have surprised China, frankly, and have very much served our interest in strengthening our alliances.”
  • A weak, flailing Russia constitutes a threat as well, though not quite a top-tier threat. “Unlike China, they have demographic problems, economic structural problems, that would require not only vision but a generation to overcome,” Obama said. “The path that Putin is taking is not going to help them overcome those challenges. But in that environment, the temptation to project military force to show greatness is strong, and that’s what Putin’s inclination is. So I don’t underestimate the dangers there.”
  • “You know, the notion that diplomacy and technocrats and bureaucrats somehow are helping to keep America safe and secure, most people think, Eh, that’s nonsense. But it’s true. And by the way, it’s the element of American power that the rest of the world appreciates unambiguously
  • When we deploy troops, there’s always a sense on the part of other countries that, even where necessary, sovereignty is being violated.”
  • Administration officials have told me that Vice President Biden, too, has become frustrated with Kerry’s demands for action. He has said privately to the secretary of state, “John, remember Vietnam? Remember how that started?” At a National Security Council meeting held at the Pentagon in December, Obama announced that no one except the secretary of defense should bring him proposals for military action. Pentagon officials understood Obama’s announcement to be a brushback pitch directed at Kerry.
  • Obama’s caution on Syria has vexed those in the administration who have seen opportunities, at different moments over the past four years, to tilt the battlefield against Assad. Some thought that Putin’s decision to fight on behalf of Assad would prompt Obama to intensify American efforts to help anti-regime rebels. But Obama, at least as of this writing, would not be moved, in part because he believed that it was not his business to stop Russia from making what he thought was a terrible mistake. “They are overextended. They’re bleeding,” he told me. “And their economy has contracted for three years in a row, drastically.
  • Obama’s strategy was occasionally referred to as the “Tom Sawyer approach.” Obama’s view was that if Putin wanted to expend his regime’s resources by painting the fence in Syria, the U.S. should let him.
  • By late winter, though, when it appeared that Russia was making advances in its campaign to solidify Assad’s rule, the White House began discussing ways to deepen support for the rebels, though the president’s ambivalence about more-extensive engagement remained. In conversations I had with National Security Council officials over the past couple of months, I sensed a foreboding that an event—another San Bernardino–style attack, for instance—would compel the United States to take new and direct action in Syria. For Obama, this would be a nightmare.
  • If there had been no Iraq, no Afghanistan, and no Libya, Obama told me, he might be more apt to take risks in Syria. “A president does not make decisions in a vacuum. He does not have a blank slate. Any president who was thoughtful, I believe, would recognize that after over a decade of war, with obligations that are still to this day requiring great amounts of resources and attention in Afghanistan, with the experience of Iraq, with the strains that it’s placed on our military—any thoughtful president would hesitate about making a renewed commitment in the exact same region of the world with some of the exact same dynamics and the same probability of an unsatisfactory outcome.”
  • What has struck me is that, even as his secretary of state warns about a dire, Syria-fueled European apocalypse, Obama has not recategorized the country’s civil war as a top-tier security threat.
  • This critique frustrates the president. “Nobody remembers bin Laden anymore,” he says. “Nobody talks about me ordering 30,000 more troops into Afghanistan.” The red-line crisis, he said, “is the point of the inverted pyramid upon which all other theories rest.”
  • “Was it a bluff?” I told him that few people now believe he actually would have attacked Iran to keep it from getting a nuclear weapon. “That’s interesting,” he said, noncommittally. I started to talk: “Do you—” He interrupted. “I actually would have,” he said, meaning that he would have struck Iran’s nuclear facilities. “If I saw them break out.”
  • “You were right to believe it,” the president said. And then he made his key point. “This was in the category of an American interest.”
  • I was reminded then of something Derek Chollet, a former National Security Council official, told me: “Obama is a gambler, not a bluffer.”
  • The president has placed some huge bets. Last May, as he was trying to move the Iran nuclear deal through Congress, I told him that the agreement was making me nervous. His response was telling. “Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
  • In the matter of the Syrian regime and its Iranian and Russian sponsors, Obama has bet, and seems prepared to continue betting, that the price of direct U.S. action would be higher than the price of inaction. And he is sanguine enough to live with the perilous ambiguities of his decisions.
  • Though in his Nobel Peace Prize speech in 2009, Obama said, “Inaction tears at our conscience and can lead to more costly intervention later,” today the opinions of humanitarian interventionists do not seem to move him, at least not publicly.
  • As he comes to the end of his presidency, Obama believes he has done his country a large favor by keeping it out of the maelstrom—and he believes, I suspect, that historians will one day judge him wise for having done so.
  • Inside the West Wing, officials say that Obama, as a president who inherited a financial crisis and two active wars from his predecessor, is keen to leave “a clean barn” to whoever succeeds him. This is why the fight against ISIS, a group he considers to be a direct, though not existential, threat to the U.S., is his most urgent priority for the remainder of his presidency; killing the so-called caliph of the Islamic State, Abu Bakr al-Baghdadi, is one of the top goals of the American national-security apparatus in Obama’s last year.
  • This is what is so controversial about the president’s approach, and what will be controversial for years to come—the standard he has used to define what, exactly, constitutes a direct threat.
  • Obama has come to a number of dovetailing conclusions about the world, and about America’s role in it. The first is that the Middle East is no longer terribly important to American interests. The second is that even if the Middle East were surpassingly important, there would still be little an American president could do to make it a better place. The third is that the innate American desire to fix the sorts of problems that manifest themselves most drastically in the Middle East inevitably leads to warfare, to the deaths of U.S. soldiers, and to the eventual hemorrhaging of U.S. credibility and power. The fourth is that the world cannot afford to see the diminishment of U.S. power. Just as the leaders of several American allies have found Obama’s leadership inadequate to the tasks before him, he himself has found world leadership wanting: global partners who often lack the vision and the will to spend political capital in pursuit of broad, progressive goals, and adversaries who are not, in his mind, as rational as he is. Obama believes that history has sides, and that America’s adversaries—and some of its putative allies—have situated themselves on the wrong one, a place where tribalism, fundamentalism, sectarianism, and militarism still flourish. What they don’t understand is that history is bending in his direction.
  • “The central argument is that by keeping America from immersing itself in the crises of the Middle East, the foreign-policy establishment believes that the president is precipitating our decline,” Ben Rhodes told me. “But the president himself takes the opposite view, which is that overextension in the Middle East will ultimately harm our economy, harm our ability to look for other opportunities and to deal with other challenges, and, most important, endanger the lives of American service members for reasons that are not in the direct American national-security interest.”
  • George W. Bush was also a gambler, not a bluffer. He will be remembered harshly for the things he did in the Middle East. Barack Obama is gambling that he will be judged well for the things he didn’t do.
malonema1

GOP senator: Trump did not make 's---hole' comment | TheHill - 0 views

  • “I’m telling you he did not use that word, George. And I’m telling you it’s a gross misrepresentation. How many times do you want me to say that?” Perdue said after host George Stephanopoulos pressed him for an answer. Perdue was one of several lawmakers participating in a meeting with Trump last week when the president reportedly referred to immigrants from African nations, El Salvador and Haiti as coming from “shithole countries.”
  • Trump allies see 's***hole' controversy as overblown (video)
  • "Following comments by the President, I said my piece directly to him yesterday. The President and all those attending the meeting know what I said and how I feel. I've always believed that America is an idea, not defined by its people but by its ideals," Graham said. 
Javier E

The Wages of Guilt: Memories of War in Germany and Japan (Ian Buruma) - 0 views

  • the main reason why Germans were more trusted by their neighbors was that they were learning, slowly and painfully, and not always fully, to trust themselves.
  • elders, in government and the mass media, still voice opinions about the Japanese war that are unsettling, to say the least. Conservative politicians still pay their annual respects at a shrine where war criminals are officially remembered. Justifications and denials of war crimes are still heard. Too many Japanese in conspicuous places, including the prime minister’s office itself, have clearly not “coped” with the war.
  • unlike Nazi Germany, Japan had no systematic program to destroy the life of every man, woman, and child of a people that, for ideological reasons, was deemed to have no right to exist.
  • “We never knew,” a common reaction in the 1950s, had worn shamefully thin in the eyes of a younger generation by the 1960s. The extraordinary criminality of a deliberate genocide was so obvious that it left no room for argument.
  • Right-wing nationalists like to cite the absence of a Japanese Holocaust as proof that Japanese have no reason to feel remorse about their war at all. It was, in their eyes, a war like any other; brutal, yes, just as wars fought by all great nations in history have been brutal. In fact, since the Pacific War was fought against Western imperialists, it was a justified—even noble—war of Asian liberation.
  • in the late 1940s or 1950s, a time when most Germans were still trying hard not to remember. It is in fact extraordinary how honestly Japanese novelists and filmmakers dealt with the horrors of militarism in those early postwar years. Such honesty is much less evident now.
  • Popular comic books, aimed at the young, extol the heroics of Japanese soldiers and kamikaze pilots, while the Chinese and their Western allies are depicted as treacherous and belligerent. In 2008, the chief of staff of the Japanese Air Self-Defense Force stated that Japan had been “tricked” into the war by China and the US. In 2013, Prime Minister Abe Shinzo publicly doubted whether Japan’s military aggression in China could even be called an invasion.
  • The fact is that Japan is still haunted by historical issues that should have been settled decades ago. The reasons are political rather than cultural, and have to do with the pacifist constitution—written by American jurists in 1946—and with the imperial institution, absolved of war guilt by General Douglas MacArthur after the war for the sake of expediency.
  • Japan, even under Allied occupation, continued to be governed by much the same bureaucratic and political elite, albeit under a new, more democratic constitution,
  • a number of conservatives felt humiliated by what they rightly saw as an infringement of their national sovereignty. Henceforth, to them, everything from the Allied Tokyo War Crimes Tribunal to the denunciations of Japan’s war record by left-wing teachers and intellectuals would be seen in this light.
  • The more “progressive” Japanese used the history of wartime atrocities as a warning against turning away from pacifism, the more defensive right-wing politicians and commentators became about the Japanese war.
  • Views of history, in other words, were politicized—and polarized—from the beginning.
  • To take the sting out of this confrontation between constitutional pacifists and revisionists, which had led to much political turmoil in the 1950s, mainstream conservatives made a deliberate attempt to distract people’s attention from war and politics by concentrating on economic growth.
  • For several decades, the chauvinistic right wing, with its reactionary views on everything from high school education to the emperor’s status, was kept in check by the sometimes equally dogmatic Japanese left. Marxism was the prevailing ideology of the teachers union and academics.
  • the influence of Marxism waned after the collapse of the Soviet empire in the early 1990s, and the brutal records of Chairman Mao and Pol Pot became widely known.
  • Marginalized in the de facto one-party LDP state and discredited by its own dogmatism, the Japanese left did not just wane, it collapsed. This gave a great boost to the war-justifying right-wing nationalists,
  • Japanese young, perhaps out of boredom with nothing but materialistic goals, perhaps out of frustration with being made to feel guilty, perhaps out of sheer ignorance, or most probably out of a combination of all three, are not unreceptive to these patriotic blandishments.
  • Anxiety about the rise of China, whose rulers have a habit of using Japan’s historical crimes as a form of political blackmail, has boosted a prickly national pride, even at the expense of facing the truth about the past.
  • By 1996, the LDP was back in power, the constitutional issue had not been resolved, and historical debates continue to be loaded with political ideology. In fact, they are not really debates at all, but exercises in propaganda, tilted toward the reactionary side.
  • My instinct—call it a prejudice, if you prefer—before embarking on this venture was that people from distinct cultures still react quite similarly to similar circumstances.
  • The Japanese and the Germans, on the whole, did not behave in the same ways—but then the circumstances, both wartime and postwar, were quite different in the two Germanies and Japan. They still are.
  • Our comic-book prejudices turned into an attitude of moral outrage. This made life easier in a way. It was comforting to know that a border divided us from a nation that personified evil. They were bad, so we must be good. To grow up after the war in a country that had suffered German occupation was to know that one was on the side of the angels.
  • The question that obsessed us was not how we would have acquitted ourselves in uniform, going over the top, running into machine-gun fire or mustard gas, but whether we would have joined the resistance, whether we would have cracked under torture, whether we would have hidden Jews and risked deportation ourselves. Our particular shadow was not war, but occupation.
  • the frightened man who betrayed to save his life, who looked the other way, who grasped the wrong horn of a hideous moral dilemma, interested me more than the hero. This is no doubt partly because I fear I would be much like that frightened man myself. And partly because, to me, failure is more typical of the human condition than heroism.
  • I was curious to learn how Japanese saw the war, how they remembered it, what they imagined it to have been like, how they saw themselves in view of their past. What I heard and read was often surprising to a European:
  • this led me to the related subject of modern Japanese nationalism. I became fascinated by the writings of various emperor worshippers, historical revisionists, and romantic seekers after the unique essence of Japaneseness.
  • Bataan, the sacking of Manila, the massacres in Singapore, these were barely mentioned. But the suffering of the Japanese, in China, Manchuria, the Philippines, and especially in Hiroshima and Nagasaki, was remembered vividly, as was the imprisonment of Japanese soldiers in Siberia after the war. The Japanese have two days of remembrance: August 6, when Hiroshima was bombed, and August 15, the date of the Japanese surrender.
  • The curious thing was that much of what attracted Japanese to Germany before the war—Prussian authoritarianism, romantic nationalism, pseudo-scientific racialism—had lingered in Japan while becoming distinctly unfashionable in Germany. Why?
  • the two peoples saw their own purported virtues reflected in each other: the warrior spirit, racial purity, self-sacrifice, discipline, and so on. After the war, West Germans tried hard to discard this image of themselves. This was less true of the Japanese.
  • Which meant that any residual feelings of nostalgia for the old partnership in Japan were likely to be met with embarrassment in Germany.
  • I have concentrated on the war against the Jews in the case of Germany, since it was that parallel war, rather than, say, the U-boat battles in the Atlantic, or even the battle of Stalingrad, that left the most sensitive scar on the collective memory of (West) Germany.
  • I have emphasized the war in China and the bombing of Hiroshima, for these episodes, more than others, have lodged themselves, often in highly symbolic ways, in Japanese public life.
  • Do Germans perhaps have more reason to mourn? Is it because Japan has an Asian “shame culture,” to quote Ruth Benedict’s phrase, and Germany a Christian “guilt culture”?
  • why the collective German memory should appear to be so different from the Japanese. Is it cultural? Is it political? Is the explanation to be found in postwar history, or in the history of the war itself?
  • the two peoples still have anything in common after the war, it is a residual distrust of themselves.
  • when Michael sees thousands of German peace demonstrators, he does not see thousands of gentle people who have learned their lesson from the past; he sees “100 percent German Protestant rigorism, aggressive, intolerant, hard.”
  • To be betroffen implies a sense of guilt, a sense of shame, or even embarrassment. To be betroffen is to be speechless. But it also implies an idea of moral purity. To be betroffen is one way to “master the past,” to show contriteness, to confess, and to be absolved and purified.
  • In their famous book, written in the sixties, entitled The Inability to Mourn, Alexander and Margarete Mitscherlich analyzed the moral anesthesia that afflicted postwar Germans who would not face their past. They were numbed by defeat; their memories appeared to be blocked. They would or could not do their labor, and confess. They appeared to have completely forgotten that they had glorified a leader who caused the death of millions.
  • There is something religious about the act of being betroffen, something close to Pietism,
  • heart of Pietism was the moral renovation of the individual, achieved by passing through the anguish of contrition into the overwhelming realization of the assurance of God’s grace.” Pietism served as an antidote to the secular and rational ideas of the French Enlightenment.
  • It began in the seventeenth century with the works of Philipp Jakob Spener. He wanted to reform the Church and bring the Gospel into daily life, as it were, by stressing good works and individual spiritual labor.
  • German television is rich in earnest discussion programs where people sit at round tables and debate the issues of the day. The audience sits at smaller tables, sipping drinks as the featured guests hold forth. The tone is generally serious, but sometimes the arguments get heated. It is easy to laugh at the solemnity of these programs, but there is much to admire about them. It is partly through these talk shows that a large number of Germans have become accustomed to political debate.
  • There was a real dilemma: at least two generations had been educated to renounce war and never again to send German soldiers to the front, educated, in other words, to want Germany to be a larger version of Switzerland. But they had also been taught to feel responsible for the fate of Israel, and to be citizens of a Western nation, firmly embedded in a family of allied Western nations. The question was whether they really could be both.
  • the Gulf War showed that German pacifism could not be dismissed simply as anti-Americanism or a rebellion against Adenauer’s West.
  • the West German mistrust of East Germans—the East Germans whose soldiers still marched in goose step, whose petit bourgeois style smacked of the thirties, whose system of government, though built on a pedestal of antifascism, contained so many disturbing remnants of the Nazi past; the East Germans, in short, who had been living in “Asia.”
  • Michael, the Israeli, compared the encounter of Westerners (“Wessies”) with Easterners (“Ossies”) with the unveiling of the portrait of Dorian Gray: the Wessies saw their own image and they didn’t like what they saw.
  • he added: “I also happen to think Japanese and Germans are racists.”
  • Germany for its Nazi inheritance and its sellout to the United States. But now that Germany had been reunified, with its specters of “Auschwitz” and its additional hordes of narrow-minded Ossies, Adenauer was deemed to have been right after
  • The picture was of Kiel in 1945, a city in ruins. He saw me looking at it and said: “It’s true that whoever is being bombed is entitled to some sympathy from us.”
  • “My personal political philosophy and maybe even my political ambition has to do with an element of distrust for the people I represent, people whose parents and grandparents made Hitler and the persecution of the Jews possible.”
  • in the seventies he had tried to nullify verdicts given in Nazi courts—without success until well into the eighties. One of the problems was that the Nazi judiciary itself was never purged. This continuity was broken only by time.
  • To bury Germany in the bosom of its Western allies, such as NATO and the EC, was to bury the distrust of Germans. Or so it was hoped. As Europeans they could feel normal, Western, civilized. Germany; the old “land in the middle,” the Central European colossus, the power that fretted over its identity and was haunted by its past, had become a Western nation.
  • It is a miracle, really, how quickly the Germans in the Federal Republic became civilized. We are truly part of the West now. We have internalized democracy. But the Germans of the former GDR, they are still stuck in a premodern age. They are the ugly Germans, very much like the West Germans after the war, the people I grew up with. They are not yet civilized.”
  • “I like the Germans very much, but I think they are a dangerous people. I don’t know why—perhaps it is race, or culture, or history. Whatever. But we Japanese are the same: we swing from one extreme to the other. As peoples, we Japanese, like the Germans, have strong collective discipline. When our energies are channeled in the right direction, this is fine, but when they are misused, terrible things happen.”
  • to be put in the same category as the Japanese—even to be compared—bothered many Germans. (Again, unlike the Japanese, who made the comparison often.) Germans I met often stressed how different they were from the Japanese,
  • To some West Germans, now so “civilized,” so free, so individualistic, so, well, Western, the Japanese, with their group discipline, their deference to authority, their military attitude toward work, might appear too close for comfort to a self-image only just, and perhaps only barely, overcome.
  • To what extent the behavior of nations, like that of individual people, is determined by history, culture, or character is a question that exercises many Japanese, almost obsessively.
  • not much sign of Betroffenheit on Japanese television during the Gulf War. Nor did one see retired generals explain tactics and strategy. Instead, there were experts from journalism and academe talking in a detached manner about a faraway war which was often presented as a cultural or religious conflict between West and Middle East. The history of Muslim-Christian-Jewish animosity was much discussed. And the American character was analyzed at length to understand the behavior of George Bush and General Schwarzkopf.
  • In the words of one Albrecht Fürst von Urach, a Nazi propagandist, Japanese emperor worship was “the most unique fusion in the world of state form, state consciousness, and religious fanaticism.” Fanaticism was, of course, a positive word in the Nazi lexicon.
  • the identity question nags in almost any discussion about Japan and the outside world. It
  • It was a respectable view, but also one founded on a national myth of betrayal. Japan, according to the myth, had become the unique moral nation of peace, betrayed by the victors who had sat in judgment of Japan’s war crimes; betrayed in Vietnam, in Afghanistan, in Nicaragua; betrayed by the arms race, betrayed by the Cold War; Japan had been victimized not only by the “gratuitous,” perhaps even “racist,” nuclear attacks on Hiroshima and Nagasaki, but by all subsequent military actions taken by the superpowers,
  • When the Prime Minister of Japan, Shidehara Kijuro, protested in 1946 to General MacArthur that it was all very well saying that Japan should assume moral leadership in renouncing war, but that in the real world no country would follow this example, MacArthur replied: “Even if no country follows you, Japan will lose nothing. It is those who do not support this who are in the wrong.” For a long time most Japanese continued to take this view.
  • What is so convenient in the cases of Germany and Japan is that pacifism happens to be a high-minded way to dull the pain of historical guilt. Or, conversely, if one wallows in it, pacifism turns national guilt into a virtue, almost a mark of superiority, when compared to the complacency of other nations.
  • The denial of historical discrimination is not just a way to evade guilt. It is intrinsic to pacifism. To even try to distinguish between wars, to accept that some wars are justified, is already an immoral position.
  • That Kamei discussed this common paranoia in such odd, Volkish terms could mean several things: that some of the worst European myths got stuck in Japan, that the history of the Holocaust had no impact, or that Japan is in some respects a deeply provincial place. I think all three explanations apply.
  • “the problem with the U.S.-Japan relationship is difficult. A racial problem, really. Yankees are friendly people, frank people. But, you know, it’s hard. You see, we have to be friendly …”
  • Like Oda, indeed like many people of the left, Kamei thought in racial terms. He used the word jinshu, literally race. He did not even use the more usual minzoku, which corresponds, in the parlance of Japanese right-wingers, to Volk, or the more neutral kokumin, meaning the citizens of a state.
  • many Germans in the liberal democratic West have tried to deal honestly with their nation’s terrible past, the Japanese, being different, have been unable to do so. It is true that the Japanese, compared with the West Germans, have paid less attention to the suffering they inflicted on others, and shown a greater inclination to shift the blame. And liberal democracy, whatever it may look like on paper, has not been the success in Japan that it was in the German Federal Republic. Cultural differences might account for this. But one can look at these matters in a different, more political way. In his book The War Against the West, published in London in 1938, the Hungarian scholar Aurel Kolnai followed the Greeks in his definition of the West: “For the ancient Greeks ‘the West’ (or ‘Europe’) meant society with a free constitution and self-government under recognized rules, where ‘law is king,’ whereas the ‘East’ (or ‘Asia’) signified theocratic societies under godlike rulers whom their subjects serve ‘like slaves.’”
  • According to this definition, both Hitler’s Germany and prewar Japan were of the East.
  • There was a great irony here: in their zeal to make Japan part of the West, General MacArthur and his advisers made it impossible for Japan to do so in spirit. For a forced, impotent accomplice is not really an accomplice at all.
  • In recent years, Japan has often been called an economic giant and a political dwarf. But this has less to do with a traditional Japanese mentality—isolationism, pacifism, shyness with foreigners, or whatnot—than with the particular political circumstances after the war that the United States helped to create.
  • when the Cold War prompted the Americans to make the Japanese subvert their constitution by creating an army which was not supposed to exist, the worst of all worlds appeared: sovereignty was not restored, distrust remained, and resentment mounted.
  • Kamei’s hawks are angry with the Americans for emasculating Japan; Oda’s doves hate the Americans for emasculating the “peace constitution.” Both sides dislike being forced accomplices, and both feel victimized, which is one reason Japanese have a harder time than Germans in coming to terms with their wartime past.
  • As far as the war against the Jews is concerned, one might go back to 1933, when Hitler came to power. Or at the latest to 1935, when the race laws were promulgated in Nuremberg. Or perhaps those photographs of burning synagogues on the night of November 9, 1938, truly marked the first stage of the Holocaust.
  • There is the famous picture of German soldiers lifting the barrier on the Polish border in 1939, but was that really the beginning? Or did it actually start with the advance into the Rhineland in 1936, or was it the annexation of the Sudetenland, or Austria, or Czechoslovakia?
  • IT IS DIFFICULT TO SAY when the war actually began for the Germans and the Japanese. I cannot think of a single image that fixed the beginning of either war in the public mind.
  • Possibly to avoid these confusions, many Germans prefer to talk about the Hitlerzeit (Hitler era) instead of “the war.”
  • only Japanese of a liberal disposition call World War II the Pacific War. People who stick to the idea that Japan was fighting a war to liberate Asia from Bolshevism and white colonialism call it the Great East Asian War (Daitōa Sensō), as in the Great East Asian Co-Prosperity Sphere.
  • The German equivalent, I suppose, would be the picture of Soviet soldiers raising their flag on the roof of the gutted Reichstag in Berlin.
  • People of this opinion separate the world war of 1941–45 from the war in China, which they still insist on calling the China Incident.
  • Liberals and leftists, on the other hand, tend to splice these wars together and call them the Fifteen-Year War (1931–45).
  • images marking the end are more obvious.
  • argued that the struggle against Western imperialism actually began in 1853, with the arrival in Japan of Commodore Perry’s ships, and spoke of the Hundred-Year War.
  • These are among the great clichés of postwar Japan: shorthand for national defeat, suffering, and humiliation.
  • The Germans called it Zusammenbruch (the collapse) or Stunde Null (Zero Hour): everything seemed to have come to an end, everything had to start all over. The Japanese called it haisen (defeat) or shusen (termination of the war).
  • kokka (nation, state) and minzoku (race, people) are not quite of the same order as Sonderbehandlung (special treatment) or Einsatzgruppe (special action squad). The jargon of Japanese imperialism was racist and overblown, but it did not carry the stench of death camps.
  • The German people are spiritually starved, Adenauer told him. “The imagination has to be provided for.” This was no simple matter, especially in the German language, which had been so thoroughly infected by the jargon of mass murder.
  • All they had been told to believe in, the Germans and the Japanese, everything from the Führerprinzip to the emperor cult, from the samurai spirit to the Herrenvolk, from Lebensraum to the whole world under one (Japanese) roof, all that lay in ruins
  • How to purge this language from what a famous German philologist called the Lingua Tertii Imperii? “… the language is no longer lived,” wrote George Steiner in 1958, “it is merely spoken.”
  • out of defeat and ruin a new school of literature (and cinema) did arise. It is known in Germany as Trümmerliteratur (literature of the ruins). Japanese writers who came of age among the ruins called themselves the yakeato seidai (burnt-out generation). Much literature of the late forties and fifties was darkened by nihilism and despair.
  • It was as though Germany—Sonderweg or no Sonderweg—needed only to be purged of Nazism, while Japan’s entire cultural tradition had to be overhauled.
  • In Germany there was a tradition to fall back on. In the Soviet sector, the left-wing culture of the Weimar Republic was actively revived. In the Western sectors, writers escaped the rats and the ruins by dreaming of Goethe. His name was often invoked to prove that Germany, too, belonged to the humanist, enlightened strain of European civilization.
  • the Americans (and many Japanese leftists) distrusted anything associated with “feudalism,” which they took to include much of Japan’s premodern past. Feudalism was the enemy of democracy. So not only did the American censors, in their effort to teach the Japanese democracy, forbid sword-fight films and samurai dramas, but at one point ninety-eight Kabuki plays were banned too.
  • yet, what is remarkable about much of the literature of the period, or more precisely, of the literature about that time, since much of it was written later, is the deep strain of romanticism, even nostalgia. This colors personal memories of people who grew up just after the war as well.
  • If the mushroom cloud and the imperial radio speech are the clichés of defeat, the scene of an American soldier (usually black) raping a Japanese girl (always young, always innocent), usually in a pristine rice field (innocent, pastoral Japan), is a stock image in postwar movies about the occupation.
  • To Ango, then, as to other writers, the ruins offered hope. At last the Japanese, without “the fake kimono” of traditions and ideals, were reduced to basic human needs; at last they could feel real love, real pain; at last they would be honest. There was no room, among the ruins, for hypocrisy.
  • Böll was able to be precise about the end of the Zusammenbruch and the beginning of bourgeois hypocrisy and moral amnesia. It came on June 20, 1948, the day of the currency reform, the day that Ludwig Erhard, picked by the Americans as Economics Director in the U.S.-British occupation zone, gave birth to the Deutsche Mark. The DM, from then on, would be the new symbol of West German national pride;
  • the amnesia, and definitely the identification with the West, was helped further along by the Cold War. West Germany now found itself on the same side as the Western allies. Their common enemy was the “Asiatic” Soviet empire. Fewer questions needed to be asked.
  • Indeed, to some people the Cold War simply confirmed what they had known all along: Germany always had been on the right side, if only our American friends had realized it earlier.
  • The process of willed forgetfulness culminated in the manic effort of reconstruction, in the great rush to prosperity.
  • “Prosperity for All” was probably the best that could have happened to the Germans of the Federal Republic. It took the seed of resentment (and thus future extremism) out of defeat. And the integration of West Germany into a Western alliance was a good thing too.
  • The “inability to mourn,” the German disassociation from the piles of corpses strewn all over Central and Eastern Europe, so that the Third Reich, as the Mitscherlichs put it, “faded like a dream,” made it easier to identify with the Americans, the victors, the West.
  • Yet the disgust felt by Böll and others for a people getting fat (“flabby” is the usual term, denoting sloth and decadence) and forgetting about its murderous past was understandable.
  • The Brückners were the price Germany had to pay for the revival of its fortunes. Indeed, they were often instrumental in it. They were the apparatchiks who functioned in any system, the small, efficient fish who voted for Christian conservatives in the West and became Communists in the East.
  • Staudte was clearly troubled by this, as were many Germans, but he offered no easy answers. Perhaps it was better this way: flabby democrats do less harm than vengeful old Nazis.
  • the forgetful, prosperous, capitalist Federal Republic of Germany was in many more or less hidden ways a continuation of Hitler’s Reich. This perfectly suited the propagandists of the GDR, who would produce from time to time lists of names of former Nazis who were prospering in the West. These lists were often surprisingly accurate.
  • In a famous film, half fiction, half documentary, made by a number of German writers and filmmakers (including Böll) in 1977, the continuity was made explicit. The film, called Germany in Autumn (Deutschland im Herbst),
  • Rainer Werner Fassbinder was one of the participants in this film. A year later he made The Marriage of Maria Braun.
  • To lifelong “antifascists” who had always believed that the Federal Republic was the heir to Nazi Germany, unification seemed—so they said—almost like a restoration of 1933. The irony was that many Wessies saw their new Eastern compatriots as embarrassing reminders of the same unfortunate past.
  • Rarely was the word “Auschwitz” heard more often than during the time of unification, partly as an always salutary reminder that Germans must not forget, but partly as an expression of pique that the illusion of a better, antifascist, anticapitalist, idealistic Germany, born in the ruins of 1945, and continued catastrophically for forty years in the East, had now been dashed forever.
  • Ludwig Erhard’s almost exact counterpart in Japan was Ikeda Hayato, Minister of Finance from 1949 and Prime Minister from 1960 to 1964. His version of Erhard’s “Prosperity for All” was the Double Your Incomes policy, which promised to make the Japanese twice as rich in ten years. Japan had an average growth rate of 11 percent during the 1960s.
  • It explains, at any rate, why the unification of the two Germanys was considered a defeat by antifascists on both sides of the former border.
  • Very few wartime bureaucrats had been purged. Most ministries remained intact. Instead it was the Communists, who had welcomed the Americans as liberators, who were purged after 1949, the year China was “lost.”
  • so the time of ruins was seen by people on the left as a time of missed chances and betrayal. Far from achieving a pacifist utopia of popular solidarity, they ended up with a country driven by materialism, conservatism, and selective historical amnesia.
  • the “red purges” of 1949 and 1950 and the return to power of men whose democratic credentials were not much better helped to turn many potential Japanese friends of the United States into enemies. For the Americans were seen as promoters of the right-wing revival and the crackdown on the left.
  • For exactly twelve years Germany was in the hands of a criminal regime, a bunch of political gangsters who had started a movement. Removing this regime was half the battle.
  • It is easier to change political institutions and hope that habits and prejudices will follow. This, however, was more easily done in Germany than in Japan.
  • There had not been a cultural break either in Japan. There were no exiled writers and artists who could return to haunt the consciences of those who had stayed.
  • There was no Japanese Thomas Mann or Alfred Döblin. In Japan, everyone had stayed.
  • In Japan there was never a clear break between a fascist and a prefascist past. In fact, Japan was never really a fascist state at all. There was no fascist or National Socialist ruling party, and no Führer either. The closest thing to it would have been the emperor, and whatever else he may have been, he was not a fascist dictator.
  • whereas after the war Germany lost its Nazi leaders, Japan lost only its admirals and generals.
  • Japan was effectively occupied only by the Americans. West Germany was part of NATO and the European Community, and the GDR was in the Soviet empire. Japan’s only formal alliance is with the United States, through a security treaty that many Japanese have opposed.
  • But the systematic subservience of Japan meant that the country never really grew up. There is a Japanese fixation on America, an obsession which goes deeper, I believe, than German anti-Americanism,
  • Yet nothing had stayed entirely the same in Japan. The trouble was that virtually all the changes were made on American orders. This was, of course, the victor’s prerogative, and many changes were beneficial.
  • like in fiction. American Hijiki, a novella by Nosaka Akiyuki, is, to my mind, a masterpiece in the short history of Japanese Trümmerliteratur.
  • Older Japanese do, however, remember the occupation, the first foreign army occupation in their national history. But it was, for the Japanese, a very unusual army. Whereas the Japanese armies in Asia had brought little but death, rape, and destruction, this one came with Glenn Miller music, chewing gum, and lessons in democracy. These blessings left a legacy of gratitude, rivalry, and shame.
  • did these films teach the Japanese democracy? Oshima thinks not. Instead, he believes, Japan learned the values of “progress” and “development.” Japan wanted to be just as rich as America—no, even richer:
  • think it is a romantic assumption, based less on history than on myth; a religious notion, expressed less through scholarship than through monuments, memorials, and historical sites turned into sacred grounds.
  • The past, wrote the West German historian Christian Meier, is in our bones. “For a nation to appropriate its history,” he argued, “is to look at it through the eyes of identity.” What we have “internalized,” he concluded, is Auschwitz.
  • Auschwitz is such a place, a sacred symbol of identity for Jews, Poles, and perhaps even Germans. The question is what or whom Germans are supposed to identify with.
  • The idea that visiting the relics of history brings the past closer is usually an illusion. The opposite is more often true.
  • To visit the site of suffering, any description of which cannot adequately express the horror, is upsetting, not because one gets closer to knowing what it was actually like to be a victim, but because such visits stir up emotions one cannot trust. It is tempting to take on the warm moral glow of identification—so easily done and so presumptuous—with the victims:
  • Were the crimes of Auschwitz, then, part of the German “identity”? Was genocide a product of some ghastly flaw in German culture, the key to which might be found in the sentimental proverbs, the cruel fairy tales, the tight leather shorts?
  • yet the imagination is the only way to identify with the past. Only in the imagination—not through statistics, documents, or even photographs—do people come alive as individuals, do stories emerge, instead of History.
  • nature. It is all right to let the witnesses speak, in the courtroom, in the museums, on videotape (Claude Lanzmann’s Shoah has been shown many times on German television), but it is not all right for German artists to use their imagination.
  • the reluctance in German fiction to look Auschwitz in the face, the almost universal refusal to deal with the Final Solution outside the shrine, the museum, or the schoolroom, suggests a fear of committing sacrilege.
  • beneath the fear of bad taste or sacrilege may lie a deeper problem. To imagine people in the past as people of flesh and blood, not as hammy devils in silk capes, is to humanize them. To humanize is not necessarily to excuse or to sympathize, but it does demolish the barriers of abstraction between us and them. We could, under certain circumstances, have been them.
  • the flight into religious abstraction was to be all too common among Germans of the Nazi generation, as well as their children; not, as is so often the case with Jews, to lend mystique to a new identity, as a patriotic Zionist, but on the contrary to escape from being the heir to a peculiarly German crime, to get away from having to “internalize” Auschwitz, or indeed from being German at all.
  • a Hollywood soap opera, a work of skillful pop, which penetrated the German imagination in a way nothing had before. Holocaust was first shown in Germany in January 1979. It was seen by 20 million people, about half the adult population of the Federal Republic; 58 percent wanted it to be repeated; 12,000 letters, telegrams, and postcards were sent to the broadcasting stations; 5,200 called the stations by telephone after the first showing; 72.5 percent were positive, 7.3 percent negative.
  • “After Holocaust,” wrote a West German woman to her local television station, “I feel deep contempt for those beasts of the Third Reich. I am twenty-nine years old and a mother of three children. When I think of the many mothers and children sent to the gas chambers, I have to cry. (Even today the Jews are not left in peace. We Germans have the duty to work every day for peace in Israel.) I bow to the victims of the Nazis, and I am ashamed to be a German.”
  • Auschwitz was a German crime, to be sure. “Death is a master from Germany.” But it was a different Germany. To insist on viewing history through the “eyes of identity,” to repeat the historian Christian Meier’s phrase, is to resist the idea of change.
  • Is there no alternative to these opposing views? I believe there is.
  • The novelist Martin Walser, who was a child during the war, believes, like Meier, that Auschwitz binds the German people, as does the language of Goethe. When a Frenchman or an American sees pictures of Auschwitz, “he doesn’t have to think: We human beings! He can think: Those Germans! Can we think: Those Nazis! I for one cannot …”
  • Adorno, a German Jew who wished to save high German culture, on whose legacy the Nazis left their bloody finger marks, resisted the idea that Auschwitz was a German crime. To him it was a matter of modern pathology, the sickness of the “authoritarian personality,” of the dehumanized SS guards, those inhumane cogs in a vast industrial wheel.
  • To the majority of Japanese, Hiroshima is the supreme symbol of the Pacific War. All the suffering of the Japanese people is encapsulated in that almost sacred word: Hiroshima. But it is more than a symbol of national martyrdom; Hiroshima is a symbol of absolute evil, often compared to Auschwitz.
  • has the atmosphere of a religious center. It has martyrs, but no single god. It has prayers, and it has a ready-made myth about the fall of man. Hiroshima, says a booklet entitled Hiroshima Peace Reader, published by the Hiroshima Peace Culture Foundation, “is no longer merely a Japanese city. It has become recognized throughout the world as a Mecca of world peace.”
  • They were not enshrined in the Japanese park, and later attempts by local Koreans to have the monument moved into Peace Park failed. There could only be one cenotaph, said the Hiroshima municipal authorities. And the cenotaph did not include Koreans.
  • What is interesting about Hiroshima—the Mecca rather than the modern Japanese city, which is prosperous and rather dull—is the tension between its universal aspirations and its status as the exclusive site of Japanese victimhood.
  • it is an opinion widely held by Japanese nationalists. The right always has been concerned with the debilitating effects on the Japanese identity of war guilt imposed by American propaganda.
  • The Japanese, in contrast, were duped by the Americans into believing that the traces of Japanese suffering should be swept away by the immediate reconstruction of Hiroshima. As a result, the postwar Japanese lack an identity and their racial virility has been sapped by American propaganda about Japanese war guilt.
  • Hiroshima, Uno wrote, should have been left as it was, in ruins, just as Auschwitz, so he claims, was deliberately preserved by the Jews. By reminding the world of their martyrdom, he said, the Jews have kept their racial identity intact and restored their virility.
  • But the idea that the bomb was a racist experiment is less plausible, since the bomb was developed for use against Nazi Germany.
  • There is another view, however, held by leftists and liberals, who would not dream of defending the “Fifteen-Year War.” In this view, the A-bomb was a kind of divine punishment for Japanese militarism. And having learned their lesson through this unique suffering, having been purified through hellfire and purgatory, so to speak, the Japanese people have earned the right, indeed have the sacred duty, to sit in judgment of others, specifically the United States, whenever they show signs of sinning against the “Hiroshima spirit.”
  • The left has its own variation of Japanese martyrdom, in which Hiroshima plays a central role. It is widely believed, for instance, that countless Japanese civilians fell victim to either a wicked military experiment or to the first strike in the Cold War, or both.
  • However, right-wing nationalists care less about Hiroshima than about the idée fixe that the “Great East Asian War” was to a large extent justified.
  • This is at the heart of what is known as Peace Education, which has been much encouraged by the leftist Japan Teachers’ Union and has been regarded with suspicion by the conservative government. Peace Education has traditionally meant pacifism, anti-Americanism, and a strong sympathy for Communist states, especially China.
  • The A-bomb, in this version, was dropped to scare the Soviets away from invading Japan. This at least is an arguable position.
  • left-wing pacifism in Japan has something in common with the romantic nationalism usually associated with the right: it shares the right’s resentment about being robbed by the Americans of what might be called a collective memory.
  • The romantic pacifists believe that the United States, to hide its own guilt and to rekindle Japanese militarism in aid of the Cold War, tried to wipe out the memory of Hiroshima.
  • few events in World War II have been described, analyzed, lamented, reenacted, re-created, depicted, and exhibited so much and so often as the bombing of Hiroshima
  • The problem with Nagasaki was not just that Hiroshima came first but also that Nagasaki had more military targets than Hiroshima. The Mitsubishi factories in Nagasaki produced the bulk of Japanese armaments. There was also something else, which is not often mentioned: the Nagasaki bomb exploded right over the area where outcasts and Christians lived. And unlike in Hiroshima, much of the rest of the city was spared the worst.
  • yet, despite these diatribes, the myth of Hiroshima and its pacifist cult is based less on American wickedness than on the image of martyred innocence and visions of the apocalypse.
  • The comparison between Hiroshima and Auschwitz is based on this notion; the idea, namely, that Hiroshima, like the Holocaust, was not part of the war, not even connected with it, but “something that occurs at the end of the world”
  • still I wonder whether it is really so different from the position of many Germans who wish to “internalize” Auschwitz, who see Auschwitz “through the eyes of identity.”
  • the Japanese to take two routes at once, a national one, as unique victims of the A-bomb, and a universal one, as the apostles of the Hiroshima spirit. This, then, is how Japanese pacifists, engaged in Peace Education, define the Japanese identity.
  • the case for Hiroshima is at least open to debate. The A-bomb might have saved lives; it might have shortened the war. But such arguments are incompatible with the Hiroshima spirit.
  • In either case, nationality has come to be based less on citizenship than on history, morality, and a religious spirit.
  • The problem with this quasi-religious view of history is that it makes it hard to discuss past events in anything but nonsecular terms. Visions of absolute evil are unique, and they are beyond human explanation or even comprehension. To explain is hubristic and amoral.
  • in the history of Japan’s foreign wars, the city of Hiroshima is far from innocent. When Japan went to war with China in 1894, the troops set off for the battlefronts from Hiroshima, and the Meiji emperor moved his headquarters there. The city grew wealthy as a result. It grew even wealthier when Japan went to war with Russia eleven years later, and Hiroshima once again became the center of military operations. As the Hiroshima Peace Reader puts it with admirable conciseness, “Hiroshima, secure in its position as a military city, became more populous and prosperous as wars and incidents occurred throughout the Meiji and Taisho periods.” At the time of the bombing, Hiroshima was the base of the Second General Headquarters of the Imperial Army (the First was in Tokyo). In short, the city was swarming with soldiers. One of the few literary masterpieces to emerge
  • when a local group of peace activists petitioned the city of Hiroshima in 1987 to incorporate the history of Japanese aggression into the Peace Memorial Museum, the request was turned down. The petition for an “Aggressors’ Corner” was prompted by junior high school students from Osaka, who had embarrassed Peace Museum officials by asking for an explanation about Japanese responsibility for the war.
  • Yukoku Ishinkai (Society for Lament and National Restoration), thought the bombing had saved Japan from total destruction. But he insisted that Japan could not be held solely responsible for the war. The war, he said, had simply been part of the “flow of history.”
  • They also demanded an official recognition of the fact that some of the Korean victims of the bomb had been slave laborers. (Osaka, like Kyoto and Hiroshima, still has a large Korean population.) Both requests were denied. So a group called Peace Link was formed, from local people, many of whom were Christians, antinuclear activists, or involved with discriminated-against minorities.
  • The history of the war, or indeed any history, is indeed not what the Hiroshima spirit is about. This is why Auschwitz is the only comparison that is officially condoned. Anything else is too controversial, too much part of the “flow of history.”
  • “You see, this museum was not really intended to be a museum. It was built by survivors as a place of prayer for the victims and for world peace. Mankind must build a better world. That is why Hiroshima must persist. We must go back to the basic roots. We must think of human solidarity and world peace. Otherwise we just end up arguing about history.”
  • Only when a young Japanese history professor named Yoshimi Yoshiaki dug up a report in American archives in the 1980s did it become known that the Japanese had stored 15,000 tons of chemical weapons on and near the island and that a 200-kilogram container of mustard gas was buried under Hiroshima.
  • what was the largest toxic gas factory in the Japanese Empire. More than 5,000 people worked there during the war, many of them women and schoolchildren. About 1,600 died of exposure to hydrocyanic acid gas, nausea gas, and lewisite. Some were damaged for life. Official Chinese sources claim that more than 80,000 Chinese fell victim to gases produced at the factory. The army was so secretive about the place that the island simply disappeared from Japanese maps.
  • in 1988, through the efforts of survivors, the small museum was built, “to pass on,” in the words of the museum guide, “the historical truth to future generations.”
  • Surviving workers from the factory, many of whom suffered from chronic lung diseases, asked for official recognition of their plight in the 1950s. But the government turned them down. If the government had compensated the workers, it would have been an official admission that the Japanese Army had engaged in an illegal enterprise. When a brief mention of chemical warfare crept into Japanese school textbooks, the Ministry of Education swiftly took it out.
  • I asked him about the purpose of the museum. He said: “Before shouting ‘no more war,’ I want people to see what it was really like. To simply look at the past from the point of view of the victim is to encourage hatred.”
  • “Look,” he said, “when you fight another man, and hit him and kick him, he will hit and kick back. One side will win. How will this be remembered? Do we recall that we were kicked, or that we started the kicking ourselves? Without considering this question, we cannot have peace.”
  • The fact that Japanese had buried poison gas under Hiroshima did not lessen the horror of the A-bomb. But it put Peace Park, with all its shrines, in a more historical perspective. It took the past away from God and put it in the fallible hands of man.
  • What did he think of the Peace Museum in Hiroshima? “At the Hiroshima museum it is easy to feel victimized,” he said. “But we must realize that we were aggressors too. We were educated to fight for our country. We made toxic gas for our country. We lived to fight the war. To win the war was our only goal.”
  • Nanking, as the capital of the Nationalist government, was the greatest prize in the attempted conquest of China. Its fall was greeted in Japan with banner headlines and nationwide celebration. For six weeks Japanese Army officers allowed their men to run amok. The figures are imprecise, but tens of thousands, perhaps hundreds of thousands (the Chinese say 300,000) of Chinese soldiers and civilians, many of them refugees from other towns, were killed. And thousands of women between the ages of about nine and seventy-five were raped, mutilated, and often murdered.
  • Was it a deliberate policy to terrorize the Chinese into submission? The complicity of the officers suggests there was something to this. But it might also have been a kind of payoff to the Japanese troops for slogging through China in the freezing winter without decent pay or rations. Or was it largely a matter of a peasant army running out of control? Or just the inevitable consequence of war, as many Japanese maintain?
  • inevitable cruelty of war. An atrocity is a willful act of criminal brutality, an act that violates the law as well as any code of human decency. It isn’t that the Japanese lack such codes or are morally incapable of grasping the concept. But “atrocity,” like “human rights,” is part of a modern terminology which came from the West, along with “feminism,” say, or “war crimes.” To right-wing nationalists it has a leftist ring, something subversive, something almost anti-Japanese.
  • During the Tokyo War Crimes Tribunal, Nanking had the same resonance as Auschwitz had in Nuremberg. And being a symbol, the Nanking Massacre is as vulnerable to mythology and manipulation as Auschwitz and Hiroshima.
  • Mori’s attitude also raises doubts about Ruth Benedict’s distinction between Christian “guilt culture” and Confucian “shame culture.”
  • In her opinion, a “society that inculcates absolute standards of morality and relies on man’s developing a conscience is a guilt culture by definition …” But in “a culture where shame is a major sanction, people are chagrined about acts which we expect people to feel guilty about.” However, this “chagrin cannot be relieved, as guilt can be, by confession and atonement …”
  • If memory was admitted at all, the Mitscherlichs wrote about Germans in the 1950s, “it was only in order to balance one’s own guilt against that of others. Many horrors had been unavoidable, it was claimed, because they had been dictated by crimes committed by the adversary.” This was precisely what many Japanese claimed, and still do claim. And it is why Mori insists on making his pupils view the past from the perspective of the aggressors.
  • Two young Japanese officers, Lieutenant N. and Lieutenant M., were on their way to Nanking and decided to test their swordsmanship: the first to cut off one hundred Chinese heads would be the winner. And thus they slashed their way through Chinese ranks, taking scalps in true samurai style. Lieutenant M. got 106, and Lieutenant N. bagged 105.
  • The story made a snappy headline in a major Tokyo newspaper: “Who Will Get There First! Two Lieutenants Already Claimed 80.” In the Nanking museum is a newspaper photograph of the two friends, glowing with youthful high spirits. Lieutenant N. boasted in the report that he had cut the necks off 56 men without even denting the blade of his ancestral sword.
  • I was told by a Japanese veteran who had fought in Nanking that such stories were commonly made up or at least exaggerated by Japanese reporters, who were ordered to entertain the home front with tales of heroism.
  • Honda Katsuichi, a famous Asahi Shimbun reporter, was told the story in Nanking. He wrote it up in a series of articles, later collected in a book entitled A Journey to China, published in 1981.
  • the whole thing developed into the Nankin Ronso, or Nanking Debate. In 1984, an anti-Honda book came out, by Tanaka Masaaki, entitled The Fabrication of the “Nanking Massacre.”
  • back in Japan, Lieutenant M. began to revise his story. Speaking at his old high school, he said that in fact he had beheaded only four or five men in actual combat. As for the rest … “After we occupied the city, I stood facing a ditch, and told the Chinese prisoners to step forward. Since Chinese soldiers are stupid, they shuffled over to the ditch, one by one, and I cleanly cut off their heads.”
  • The nationalist intellectuals are called goyo gakusha by their critics. It is a difficult term to translate, but the implied meaning is “official scholars,” who do the government’s bidding.
  • the debate on the Japanese war is conducted almost entirely outside Japanese universities, by journalists, amateur historians, political columnists, civil rights activists, and so forth. This means that the zanier theories of the likes of Tanaka…
  • The other reason was that modern history was not considered academically respectable. It was too fluid, too political, too controversial. Until 1955, there was not one modern historian on the staff of Tokyo University. History stopped around the middle of the nineteenth century. And even now, modern…
  • In any case, so the argument invariably ends, Hiroshima, having been planned in cold blood, was a far worse crime. “Unlike in Europe or China,” writes Tanaka, “you won’t find one instance of planned, systematic murder in the entire history of Japan.” This is because the Japanese…
  • One reason is that there are very few modern historians in Japan. Until the end of the war, it would have been dangerously subversive, even blasphemous, for a critical scholar to write about modern…
  • they have considerable influence on public opinion, as television commentators, lecturers, and contributors to popular magazines. Virtually none of them are professional historians.
  • Tanaka and others have pointed out that it is physically impossible for one man to cut off a hundred heads with one blade, and that for the same reason Japanese troops could never have…
  • Besides, wrote Tanaka, none of the Japanese newspapers reported any massacre at the time, so why did it suddenly come up…
  • He admits that a few innocent people got killed in the cross fire, but these deaths were incidental. Some soldiers were doubtless a bit rough, but…
  • even he defends an argument that all the apologists make too: “On the battlefield men face the ultimate extremes of human existence, life or death. Extreme conduct, although still ethically…
  • atrocities carried out far from the battlefield dangers and imperatives and according to a rational plan were acts of evil barbarism. The Auschwitz gas chambers of our ‘ally’ Germany and the atomic bombing of our…
  • The point that it was not systematic was made by leftist opponents of the official scholars too. The historian Ienaga Saburo, for example, wrote that the Nanking Massacre, whose scale and horror he does not deny, “may have been a reaction to the fierce Chinese resistance after the Shanghai fighting.” Ienaga’s…
  • The nationalist right takes the opposite view. To restore the true identity of Japan, the emperor must be reinstated as a religious head of state, and Article Nine must be revised to make Japan a legitimate military power again. For this reason, the Nanking Massacre, or any other example of extreme Japanese aggression, has to be ignored, softened, or denied.
  • the question remains whether the raping and killing of thousands of women, and the massacre of thousands, perhaps hundreds of thousands, of other unarmed people, in the course of six weeks, can still be called extreme conduct in the heat of battle. The question is pertinent, particularly when such extreme violence is justified by an ideology which teaches the aggressors that killing an inferior race is in accordance with the will of their divine emperor.
  • The politics behind the symbol are so divided and so deeply entrenched that it hinders a rational historical debate about what actually happened in 1937. The more one side insists on Japanese guilt, the more the other insists on denying it.
  • The Nanking Massacre, for leftists and many liberals too, is the main symbol of Japanese militarism, supported by the imperial (and imperialist) cult. Which is why it is a keystone of postwar pacifism. Article Nine of the constitution is necessary to avoid another Nanking Massacre.
  • The Japanese, he said, should see their history through their own eyes, for “if we rely on the information of aliens and alien countries, who use history for the sake of propaganda, then we are in danger of losing the sense of our own history.” Yet another variation of seeing history through the eyes of identity.
  • their emotions were often quite at odds with the idea of “shame culture” versus “guilt culture.” Even where the word for shame, hazukashii, was used, its meaning was impossible to distinguish from the Western notion of guilt.
  • wasn’t so bad in itself. But then they killed them. You see, rape was against military regulations, so we had to destroy the evidence. While the women were fucked, they were considered human, but when we killed them, they were just pigs. We felt no shame about it, no guilt. If we had, we couldn’t have done it.
  • “Whenever we would enter a village, the first thing we’d do was steal food, then we’d take the women and rape them, and finally we’d kill all the men, women, and children to make sure they couldn’t slip away and tell the Chinese troops where we were. Otherwise we wouldn’t have been able to sleep at night.”
  • Clearly, then, the Nanking Massacre had been the culmination of countless massacres on a smaller scale. But it had been mass murder without a genocidal ideology. It was barbaric, but to Azuma and his comrades, barbarism was part of war.
  • “Sexual desire is human,” he said. “Since I suffered from a venereal disease, I never actually did it with Chinese women. But I did peep at their private parts. We’d always order them to drop their trousers. They never wore any underwear, you know. But the others did it with any woman that crossed our path.
  • He did have friends, however, who took part in the killings. One of them, Masuda Rokusuke, killed five hundred men by the Yangtze River with his machine gun. Azuma visited his friend in the hospital just before he died in the late 1980s. Masuda was worried about going to hell. Azuma tried to reassure him that he was only following orders. But Masuda remained convinced that he was going to hell.
  • “One of the worst moments I can remember was the killing of an old man and his grandson. The child was bayoneted and the grandfather started to suck the boy’s blood, as though to conserve his grandson’s life a bit longer. We watched a while and then killed both. Again, I felt no guilt, but I was bothered by this kind of thing. I felt confused. So I decided to keep a diary. I thought it might help me think straight.”
  • What about his old comrades? I asked. How did they discuss the war? “Oh,” said Azuma, “we wouldn’t talk about it much. When we did, it was to justify it. The Chinese resisted us, so we had to do what we did, and so on. None of us felt any remorse. And I include myself.”
  • got more and more agitated. “They turned the emperor into a living god, a false idol, like the Ayatollah in Iran or like Kim Il Sung. Because we believed in the divine emperor, we were prepared to do anything, anything at all, kill, rape, anything. But I know he fucked his wife every night, just like we do …” He paused and lowered his voice. “But you know we cannot say this in Japan, even today. It is impossible in this country to tell the truth.”
  • My first instinct was to applaud West German education. Things had come a long way since 1968. There had been no school classes at Nuremberg, or even at the Auschwitz trial in Frankfurt from 1963 till 1965. Good for the teacher, I thought. Let them hear what was done. But I began to have doubts.
  • Just as belief belongs in church, surely history education belongs in school. When the court of law is used for history lessons, then the risk of show trials cannot be far off. It may be that show trials can be good politics—though I have my doubts about this too. But good politics don’t necessarily serve the truth.
  • There is a story about the young Richard when he was in Nuremberg at the time of the war crimes trials. He is said to have turned to a friend and to have remarked, in his best Wehrmacht officer style, that they should storm the court and release the prisoners. The friend, rather astonished, asked why on earth they should do such a thing. “So that we can try them ourselves” was Weizsäcker’s alleged response.
  • There was also concern that international law might not apply to many of the alleged crimes. If revenge was the point, why drag the law into it? Why not take a political decision to punish? This was what Becker, in his office, called the Italian solution: “You kill as many people as you can in the first six weeks, and then you forget about it: not very legal, but for the purposes of purification, well …”
  • Becker was not against holding trials as such. But he believed that existing German laws should have been applied, instead of retroactive laws about crimes against peace (preparing, planning, or waging an aggressive war).
  • It was to avoid a travesty of the legal process that the British had been in favor of simply executing the Nazi leaders without a trial. The British were afraid that a long trial might change public opinion. The trial, in the words of one British diplomat, might be seen as a “put-up job.”
  • The question is how to achieve justice without distorting the law, and how to stage a trial by victors over the vanquished without distorting history. A possibility would have been to make victors’ justice explicit, by letting military courts try the former enemies.
  • This would have avoided much hypocrisy and done less damage to the due process of law in civilian life. But if the intention was to teach Germans a history lesson, a military court would have run into the same problems as a civilian one.
  • Due process or revenge. This problem had preoccupied the ancient Greek tragedians. To break the cycle of vendetta, Orestes had to be tried by the Athens court for the murder of his mother. Without a formal trial, the vengeful Furies would continue to haunt the living.
  • The aspect of revenge might have been avoided had the trial been held by German judges. There was a precedent for this, but it was not a happy one. German courts had been allowed to try alleged war criminals after World War I. Despite strong evidence against them, virtually all were acquitted, and the foreign delegates were abused by local mobs. Besides, Wetzka was right: German judges had collaborated with the Nazi regime; they could hardly be expected to be impartial. So it was left to the victors to see that justice was done.
  • When the American chief prosecutor in Nuremberg, Robert H. Jackson, was asked by the British judge, Lord Justice Lawrence, what he thought the purpose of the trials should be, Jackson answered that they were to prove to the world that the German conduct of the war had been unjustified and illegal, and to demonstrate to the German people that this conduct deserved severe punishment and to prepare them for
  • What becomes clear from this kind of language is that law, politics, and religion became confused: Nuremberg became a morality play, in which Göring, Kaltenbrunner, Keitel, and the others were cast in the leading roles. It was a play that claimed to deliver justice, truth, and the defeat of evil.
  • The Nuremberg trials were to be a history lesson, then, as well as a symbolic punishment of the German people—a moral history lesson cloaked in all the ceremonial trappings of due legal process. They were the closest that man, or at least the men belonging to the victorious powers, could come to dispensing divine justice. This was certainly the way some German writers felt about it. Some welcomed it
  • We now have this law on our books, the prosecutor said: “It will be used against the German aggressor this time. But the four powers, who are conducting this trial in the name of twenty-three nations, know this law and declare: Tomorrow we shall be judged before history by the same yardstick by which we judge these defendants today.”
  • “We had seen through the amorality of the Nazis, and wanted to rid ourselves of it. It was from the moral seriousness of the American prosecution that we wished to learn sensible political thinking. “And we did learn. “And we allowed ourselves to apply this thinking to the present time. For example, we will use it now to take quite literally the morality of those American prosecutors. Oradour and Lidice—today they are cities in South Vietnam” (Italics in the original text.)
  • The play ends with a statement by the American prosecutor on crimes against peace
  • (It was decided in 1979, after the shock of the Holocaust TV series, to abolish the statute of limitations for crimes against humanity.)
  • after Nuremberg, most Germans were tired of war crimes. And until the mid-1950s German courts were permitted to deal only with crimes committed by Germans against other Germans. It took the bracing example of the Eichmann trial in Jerusalem to jolt German complacency—that, and the fact that crimes committed before 1946 would no longer be subject to prosecution after 1965.
  • Trying the vanquished for conventional war crimes was never convincing, since the victors could be accused of the same. Tu quoque could be invoked, in private if not in the Nuremberg court, when memories of Dresden and Soviet atrocities were still fresh. But Auschwitz had no equivalent. That was part of another war, or, better, it was not really a war at all; it was mass murder pure and simple, not for reasons of strategy or tactics, but of ideology alone.
  • Whether you are a conservative who wants Germany to be a “normal” nation or a liberal/leftist engaging in the “labor of mourning,” the key event of World War II is Auschwitz, not the Blitzkrieg, not Dresden, not even the war on the eastern front. This was the one history lesson of Nuremberg that stuck. As Hellmut Becker said, despite his skepticism about Nuremberg: “It was most important that the German population realized that crimes against humanity had taken place and that during the trials it became clear how they had taken place.”
  • In his famous essay on German guilt, Die Schuldfrage (The Question of German Guilt), written in 1946, Karl Jaspers distinguished four categories of guilt: criminal guilt, for breaking the law; political guilt, for being part of a criminal political system; moral guilt, for personal acts of criminal behavior; and metaphysical guilt, for failing in one’s responsibility to maintain the standards of civilized humanity. Obviously these categories overlap.
  • The great advantage, in his view, of a war crimes trial was its limitation. By allowing the accused to defend themselves with arguments, by laying down the rules of due process, the victors limited their own powers.
  • In any event, the trial distanced the German people even further from their former leaders. It was a comfortable distance, and few people had any desire to bridge it. This might be why the Nazi leaders are hardly ever featured in German plays, films, or novels.
  • And: “For us Germans this trial has the advantage that it distinguishes between the particular crimes of the leaders and that it does not condemn the Germans collectively.”
  • Serious conservative intellectuals, such as Hermann Lübbe, argued that too many accusations would have blocked West Germany’s way to becoming a stable, prosperous society. Not that Lübbe was an apologist for the Third Reich. Far from it: the legitimacy of the Federal Republic, in his opinion, lay in its complete rejection of the Nazi state.
  • their reaction was often one of indignation. “Why me?” they would say. “I just did my duty. I just followed orders like every decent German. Why must I be punished?”
  • “that these criminals were so like all of us at any point between 1918 and 1945 that we were interchangeable, and that particular circumstances caused them to take a different course, which resulted in this trial, these matters could not be properly discussed in the courtroom.” The terrible acts of individuals are lifted from their historical context. History is reduced to criminal pathology and legal argument.
  • they will not do as history lessons, nor do they bring us closer to that elusive thing that Walser seeks, a German identity.
  • The GDR had its own ways of using courts of law to deal with the Nazi past. They were in many respects the opposite of West German ways. The targets tended to be the very people that West German justice had ignored.
  • Thorough purges took place in the judiciary, the bureaucracy, and industry. About 200,000 people—four-fifths of the Nazi judges and prosecutors—lost their jobs. War crimes trials were held too; until 1947 by the Soviets, after that in German courts.
  • There were two more before 1957, and none after that. All in all, about 30,000 people had been tried and 500 executed. In the Federal Republic the number was about 91,000, and none were executed, as the death penalty was abolished by the 1949 constitution.
  • East German methods were both ruthless and expedient, and the official conclusion to the process was that the GDR no longer had to bear the burden of guilt. As state propaganda ceaselessly pointed out, the guilty were all in the West. There the fascists still sat as judges and ran the industries that produced the economic boom, the Wirtschaftswunder.
  • society. Although some of his critics, mostly on the old left, in both former Germanys, called him a grand inquisitor, few doubted the pastor’s good intentions. His arguments for trials were moral, judicial, and historical. He set out his views in a book entitled The Stasi Documents. Echoes of an earlier past rang through almost every page. “We can
  • Germany of the guilty, the people who felt betroffen by their own “inability to mourn,” the nation that staged the Auschwitz and Majdanek trials, that Germany was now said to stand in judgment over the other Germany—the Germany of the old antifascists, the Germany that had suffered under two dictatorships, the Germany of uniformed marches, goose-stepping drills, and a secret police network, vast beyond even the Gestapo’s dreams.
  • It is almost a form of subversion to defend a person who stands accused in court. So the idea of holding political and military leaders legally accountable for their actions was even stranger in Japan than it was in Germany. And yet, the shadows thrown by the Tokyo trial have been longer and darker in Japan than those of the Nuremberg trial in Germany.
  • never was—unlike, say, the railway station or the government ministry—a central institution of the modern Japanese state. The law was not a means to protect the people from arbitrary rule; it was, rather, a way for the state to exercise more control over the people. Even today, there are relatively few lawyers in Japan.
  • Japanese school textbooks are the product of so many compromises that they hardly reflect any opinion at all. As with all controversial matters in Japan, the more painful, the less said. In a standard history textbook for middle school students, published in the 1980s, mention of the Tokyo trial takes up less than half a page. All it says is that the trial…
  • As long as the British and the Americans continued to be oppressors in Asia, wrote a revisionist historian named Hasegawa Michiko, who was born in 1945, “confrontation with Japan was inevitable. We did not fight for Japan alone. Our aim was to fight a Greater East Asia War. For this reason the war between Japan and China and Japan’s oppression of…
  • West German textbooks describe the Nuremberg trial in far more detail. And they make a clear distinction between the retroactive law on crimes against peace and the…
  • Nationalist revisionists talk about “the Tokyo Trial View of History,” as though the conclusions of the tribunal had been nothing but rabid anti-Japanese propaganda. The tribunal has been called a lynch mob, and Japanese leftists are blamed for undermining the morale of generations of Japanese by passing on the Tokyo Trial View of History in school textbooks and liberal publications. The Tokyo Trial…
  • When Hellmut Becker said that few Germans wished to criticize the procedures of the Nuremberg trial because the criminality of the defendants was so plain to see, he was talking about crimes against humanity—more precisely, about the Holocaust. And it was…
  • The knowledge compiled by the doctors of Unit 731—of freezing experiments, injection of deadly diseases, vivisections, among other things—was considered so valuable by the Americans in 1945 that the doctors…
  • those aspects of the war that were most revolting and furthest removed from actual combat, such as the medical experiments on human guinea pigs (known as “logs”) carried out by Unit 731 in…
  • There never were any Japanese war crimes trials, nor is there a Japanese Ludwigsburg. This is partly because there was no exact equivalent of the Holocaust. Even though the behavior of Japanese troops was often barbarous, and the psychological consequences of State Shinto and emperor worship were frequently as hysterical as Nazism, Japanese atrocities were part of a…
  • This difference between (West) German and Japanese textbooks is not just a matter of detail; it shows a gap in perception. To the Japanese, crimes against humanity are not associated with an equivalent to the…
  • on what grounds would Japanese courts have prosecuted their own former leaders? Hata’s answer: “For starting a war which they knew they would lose.” Hata used the example of General Galtieri and his colleagues in Argentina after losing the Falklands War. In short, they would have been tried for losing the war, and the intense suffering they inflicted on their own people. This is as though German courts in 1918 had put General Hindenburg or General Ludendorff on trial.
  • it shows yet again the fundamental difference between the Japanese war, in memory and, I should say, in fact, and the German experience. The Germans fought a war too, but the one for which they tried their own people, the Bogers and the Schwammbergers, was a war they could not lose, unless defeat meant that some of the enemies survived.
  • Just as German leftists did in the case of Nuremberg, Kobayashi used the trial to turn the tables against the judges. But not necessarily to mitigate Japanese guilt. Rather, it was his intention to show how the victors had betrayed the pacifism they themselves had imposed on Japan.
  • the Japanese left has a different view of the Tokyo trial than the revisionist right. It is comparable to the way the German left looks upon Nuremberg. This was perfectly, if somewhat long-windedly, expressed in Kobayashi Masaki’s documentary film Tokyo Trial, released in 1983. Kobayashi is anything but an apologist for the Japanese war. His most famous film, The Human Condition, released in 1959, took a highly critical view of the war.
  • Yoshimoto’s memory was both fair and devastating, for it pointed straight at the reason for the trial’s failure. The rigging of a political trial—the “absurd ritual”—undermined the value of that European idea of law.
  • Yoshimoto went on to say something no revisionist would ever mention: “I also remember my fresh sense of wonder at this first encounter with the European idea of law, which was so different from the summary justice in our Asiatic courts. Instead of getting your head chopped off without a proper trial, the accused were able to defend themselves, and the careful judgment appeared to follow a public procedure.”
  • Yoshimoto Takaaki, philosopher of the 1960s New Left. Yet he wrote in 1986 that “from our point of view as contemporaries and witnesses, the trial was partly plotted from the very start. It was an absurd ritual before slaughtering the sacrificial lamb.”
  • This, from all accounts, was the way it looked to most Japanese, even if they had little sympathy for most of the “lambs.” In 1948, after three years of American occupation censorship and boosterism, people listened to the radio broadcast of the verdicts with a sad but fatalist shrug: this is what you can expect when you lose the war.
  • Some of the information even surprised the defendants. General Itagaki Seishiro, a particularly ruthless figure, who was in command of prison camps in Southeast Asia and whose troops had massacred countless Chinese civilians, wrote in his diary: “I am learning of matters I had not known and recalling things I had forgotten.”
  • hindsight, one can only conclude that instead of helping the Japanese to understand and accept their past, the trial left them with an attitude of cynicism and resentment.
  • After it was over, the Nippon Times pointed out the flaws of the trial, but added that “the Japanese people must ponder over why it is that there has been such a discrepancy between what they thought and what the rest of the world accepted almost as common knowledge. This is at the root of the tragedy which Japan brought upon herself.”
  • Political trials produce politicized histories. This is what the revisionists mean when they talk about the Tokyo Trial View of History. And they are right, even if their own conclusions are not.
  • Frederick Mignone, one of the prosecutors, said a trifle histrionically that “in Japan and in the Orient in general, the trial is one of the most important phases of the occupation. It has received wide coverage in the Japanese press and revealed for the first time to millions of Japanese the scheming, duplicity, and insatiable desire for power of her entrenched militaristic leaders, writing a much-needed history of events which otherwise would not have been written.” It was indeed much-needed, since so little was known.
  • The president of the Tokyo tribunal, Sir William Webb, thought “the crimes of the German accused were far more heinous, varied and extensive than those of the Japanese accused.” Put in another way, nearly all the defendants at Nuremberg, convicted of crimes against peace, were also found guilty of crimes against humanity. But half the Japanese defendants received life sentences for political crimes only.
  • the question of responsibility is always a tricky affair in Japan, where formal responsibility is easier to identify than actual guilt. Not only were there many men, such as the hero of Kinoshita’s play, who took the blame for what their superiors had done—a common practice in Japan, in criminal gangs as well as in politics or business corporations—but the men at the top were often not at all in control of their unscrupulous subordinates.
  • “These men were not the hoodlums who were the powerful part of the group which stood before the tribunal at Nuremberg, dregs of a criminal environment, thoroughly schooled in the ways of crime and knowing no other methods but those of crime. These men were supposed to be the elite of the nation, the honest and trusted leaders to whom the fate of the nation had been confidently entrusted
  • many people were wrongly accused of the wrong things for the wrong reasons. This is why there was such sympathy in Japan for the men branded by foreigners as war criminals, particularly the so-called Class B and Class C criminals, the men who followed orders, or gave them at a lower level: field commanders, camp guards, and so on.
  • “The Japanese people are of the opinion that the actual goal of the war crimes tribunals was never realized, since the judgments were reached by the victors alone and had the character of revenge. The [Japanese] war criminal is not conscious of having committed a crime, for he regards his deeds as acts of war, committed out of patriotism.”
  • Yamashita Tomoyuki. Terrible atrocities were committed under his command in the Philippines. The sacking of Manila in 1945 was about as brutal as the Nanking Massacre. So to depict him in the movie as a peaceful gentleman, while portraying the American prosecutor in Manila as one of the main villains, might seem an odd way to view the past.
  • The Shrine ranks highest. It is the supreme symbol of authority, shouldered (like a shrine on festival days) by the Officials.
  • The political theorist Maruyama Masao called the prewar Japanese government a “system of irresponsibilities.” He identified three types of political personalities: the portable Shrine, the Official, and the Outlaw.
  • those who carry it, the Officials, are the ones with actual power. But the Officials—bureaucrats, politicians, admirals and generals—are often manipulated by the lowest-ranking Outlaws, the military mavericks, the hotheaded officers in the field, the mad nationalists, and other agents of violence.
  • But it was not entirely wrong, for the trial was rigged. Yamashita had no doubt been a tough soldier, but in this case he had been so far removed from the troops who ran amok in Manila that he could hardly have known what was going on. Yet the American prosecutor openly talked about his desire to hang “Japs.”
  • When the system spins out of control, as it did during the 1930s, events are forced by violent Outlaws, reacted to by nervous Officials, and justified by the sacred status of the Shrines.
  • Here we come to the nub of the problem, which the Tokyo trial refused to deal with, the role of the Shrine in whose name every single war crime was committed, Emperor Hirohito,
  • The historian Ienaga Saburo tells a story about a Japanese schoolchild in the 1930s who was squeamish about having to dissect a live frog. The teacher rapped him hard on the head with his knuckles and said: “Why are you crying about one lousy frog? When you grow up you’ll have to kill a hundred, two hundred Chinks.”
  • the lethal consequences of the emperor-worshipping system of irresponsibilities did emerge during the Tokyo trial. The savagery of Japanese troops was legitimized, if not driven, by an ideology that did not include a Final Solution but was as racialist as Hitler’s National Socialism. The Japanese were the Asian Herrenvolk, descended from the gods.
  • A veteran of the war in China said in a television interview that he was able to kill Chinese without qualms only because he didn’t regard them as human.
  • For to keep the emperor in place (he could at least have been made to resign), Hirohito’s past had to be freed from any blemish; the symbol had to be, so to speak, cleansed from what had been done in its name.
  • The same was true of the Japanese imperial institution, no matter who sat on the throne, a ruthless war criminal or a gentle marine biologist.
  • the chaplain at Sugamo prison, questioned Japanese camp commandants about their reasons for mistreating POWs. This is how he summed up their answers: “They had a belief that any enemy of the emperor could not be right, so the more brutally they treated their prisoners, the more loyal to their emperor they were being.”
  • The Mitscherlichs described Hitler as “an object on which Germans depended, to which they transferred responsibility, and he was thus an internal object. As such, he represented and revived the ideas of omnipotence that we all cherish about ourselves from infancy.
  • The fear after 1945 was that without the emperor Japan would be impossible to govern. In fact, MacArthur behaved like a traditional Japanese strongman (and was admired for doing so by many Japanese), using the imperial symbol to enhance his own power. As a result, he hurt the chances of a working Japanese democracy and seriously distorted history.
  • Aristides George Lazarus, the defense counsel of one of the generals on trial, was asked to arrange that “the military defendants, and their witnesses, would go out of their way during their testimony to include the fact that Hirohito was only a benign presence when military actions or programs were discussed at meetings that, by protocol, he had to attend.” No doubt the other counsel were given similar instructions. Only once during the trial
katyshannon

In Flint, Mich., there's so much lead in children's blood that a state of emergency is ... - 0 views

  • For months, worried parents in Flint, Mich., arrived at their pediatricians’ offices in droves. Holding a toddler by the hand or an infant in their arms, they all have the same question: Are their children being poisoned?
  • To find out, all it takes is a prick of the finger, a small letting of blood. If tests come back positive, the potentially severe consequences are far more difficult to discern.
  • That’s how lead works. It leaves its mark quietly, with a virtually invisible trail. But years later, when a child shows signs of a learning disability or behavioral issues, lead’s prior presence in the bloodstream suddenly becomes inescapable.
  • ...9 more annotations...
  • According to the World Health Organization, “lead affects children’s brain development resulting in reduced intelligence quotient (IQ), behavioral changes such as shortening of attention span and increased antisocial behavior, and reduced educational attainment. Lead exposure also causes anemia, hypertension, renal impairment, immunotoxicity and toxicity to the reproductive organs. The neurological and behavioral effects of lead are believed to be irreversible.”
  • The Hurley Medical Center, in Flint, released a study in September that confirmed what many Flint parents had feared for over a year: The proportion of infants and children with above-average levels of lead in their blood has nearly doubled since the city switched from the Detroit water system to using the Flint River as its water source, in 2014.
  • The crisis reached a nadir Monday night, when Flint Mayor Karen Weaver declared a state of emergency. “The City of Flint has experienced a Manmade disaster,” Weaver said in a declaratory statement.
  • The mayor — elected after her predecessor, Dayne Walling, experienced fallout from his administration’s handling of the water problems — said in the statement that she was seeking support from the federal government to deal with the “irreversible” effects of lead exposure on the city’s children. Weaver thinks that these health consequences will lead to a greater need for special education and mental health services, as well as developments in the juvenile justice system.
  • To those living in Flint, the announcement may feel as if it has been a long time coming. Almost immediately after the city started drawing from the Flint River in April 2014, residents began complaining about the water, which they said was cloudy in appearance and emitted a foul odor.
  • Since then, complications from the water coming from the Flint River have only piled up. Although city and state officials initially denied that the water was unsafe, the state issued a notice informing Flint residents that their water contained unlawful levels of trihalomethanes, a chlorine byproduct linked to cancer and other diseases.
  • Protesters marched to City Hall in the fierce Michigan cold, calling for officials to reconnect Flint’s water to the Detroit system. The use of the Flint River was supposed to be temporary, set to end in 2016 after a pipeline to Lake Huron’s Karegnondi Water Authority is finished.
  • Through continued demonstrations by Flint residents and mounting scientific evidence of the water’s toxins, city and state officials offered various solutions — from asking residents to boil their water to providing them with water filters — in an attempt to work around the need to reconnect to the Detroit system.
  • That call was finally made by Gov. Rick Snyder (R) on Oct. 8. He announced that he had a plan for coming up with the $12 million to switch Flint back to the Detroit system. On Oct. 16, water started flowing again from Detroit to Flint.
aidenborst

Opinion: A company in Brazil made a controversial move to fight racism. Other CEOs shou... - 0 views

  • Although she's not a household name in the United States, billionaire Luiza Trajano, the richest woman in Brazil, might very well become one soon if her radical new model to confront structural racism takes hold.
  • Magalu's coveted trainee program, long considered a major stepping stone into Brazil's corporate world, will now only admit Black Brazilians into its ranks in an effort to upend a system that oftentimes sidelines Brazilians of African heritage from rising up the corporate ladder.
  • The Magalu announcement quickly reverberated across the Brazilian media landscape. It was a bold move, no doubt, but not one without blowback; there have been calls across social media for a boycott of the company's stores.
  • ...12 more annotations...
  • Of course, such a move in the United States would immediately run afoul of long-established laws stemming from Title VII of the 1964 Civil Rights Act, which set up the EEOC (Equal Employment Opportunity Commission) to adjudicate race-based hiring, firing and promotional grievances. Seminal cases such as Griggs v. Duke Power Co. (1971), McDonnell Douglas Corp. v. Green (1973) and Hazelwood School District v. United States (1977), among many others, served to advance the legal structure through which American companies now deal with matters of race and equity in the workplace
  • Over time, these lawsuits gave EEO policies more teeth by defining a legal framework for ensuring workplace protections. They also forced companies to rewrite or get rid of unfair employment policies and practices.
  • However, the cruel irony of America's efforts to curb workplace discrimination is that once Title VII forcibly removed race from the hiring equation, it immediately became that much harder to enact programs to address systemic racism in ways that might be beneficial, which is why our country's long attempts at promoting affirmative action programs ultimately failed.
  • No matter how we got here, the current system is clearly not working; White males still account for the majority of executive positions. Among the CEOs of Fortune 500 companies, only 1% are Black.
  • America has a diversity problem, and our largest corporations need to embrace bold new models about how to accelerate social and racial justice within their ranks.
  • CEOs should start by stripping down America's foundational myth of meritocracy -- the notion that one's ability to get ahead in life is solely a function of the combined strength of their efforts and abilities -- and approach corporate recruiting from a new angle.
  • Several corporate programs, such as Starbucks' College Achievement Plan, have taken steps to make higher education more accessible for employees, but fall short of addressing the social, environmental and economic vectors that impinge upon disadvantaged youths.
  • What if growing up in a low-income, single-parent household, instead of being seen as an impediment to climbing the social ladder, positioned high-potential young teens for corporate-sponsored talent development programs that would support them from junior high, through high school and college and into the sponsor's corporate ranks? Such a program executed at scale would invariably lift up disadvantaged White youths as well, but that would be a feature, not a bug, making the entire initiative less controversial.
  • Despite the controversy around the decision, the Trajanos are not wavering. "We want to see more Black Brazilians in positions of leadership in Magalu; this diversity will make us a better company, capable of delivering a better return to our shareholders," Frederico Trajano wrote in a recent article.
  • "Today the racial make-up of Brazil is over 50% Black and Brown -- it basically looks like what the United States is projected to look like by 2050," observed Frederico Trajano in a recent Zoom interview with me. "American CEOs of large companies would be well-served by looking at what we are doing down here in Brazil on many fronts, including how to ensure that a company's leadership team better reflects the public it serves."
  • Here in the United States, Americans just elected the first woman of color, Kamala Harris, herself the daughter of Jamaican and Indian immigrants, as vice president
  • American CEOs should look south, and take their cues on racial justice from a bold businesswoman and her son from Brazil.
Javier E

There's No Such Thing As 'Sound Science' | FiveThirtyEight - 0 views

  • Science is being turned against itself. For decades, its twin ideals of transparency and rigor have been weaponized by those who disagree with results produced by the scientific method. Under the Trump administration, that fight has ramped up again.
  • The same entreaties crop up again and again: We need to root out conflicts. We need more precise evidence. What makes these arguments so powerful is that they sound quite similar to the points raised by proponents of a very different call for change that’s coming from within science.
  • Despite having dissimilar goals, the two forces espouse principles that look surprisingly alike: Science needs to be transparent. Results and methods should be openly shared so that outside researchers can independently reproduce and validate them. The methods used to collect and analyze data should be rigorous and clear, and conclusions must be supported by evidence.
  • ...26 more annotations...
  • they’re also used as talking points by politicians who are working to make it more difficult for the EPA and other federal agencies to use science in their regulatory decision-making, under the guise of basing policy on “sound science.” Science’s virtues are being wielded against it.
  • The sound science tactic exploits a fundamental feature of the scientific process: Science does not produce absolute certainty. Contrary to how it’s sometimes represented to the public, science is not a magic wand that turns everything it touches to truth. Instead, it’s a process of uncertainty reduction, much like a game of 20 Questions.
  • “Our criticisms are founded in a confidence in science,” said Steven Goodman, co-director of the Meta-Research Innovation Center at Stanford and a proponent of open science. “That’s a fundamental difference — we’re critiquing science to make it better. Others are critiquing it to devalue the approach itself.”
  • Calls to base public policy on “sound science” seem unassailable if you don’t know the term’s history. The phrase was adopted by the tobacco industry in the 1990s to counteract mounting evidence linking secondhand smoke to cancer.
  • What distinguishes the two calls for transparency is intent: Whereas the “open science” movement aims to make science more reliable, reproducible and robust, proponents of “sound science” have historically worked to amplify uncertainty, create doubt and undermine scientific discoveries that threaten their interests.
  • Delay is a time-tested strategy. “Gridlock is the greatest friend a global warming skeptic has,” said Marc Morano, a prominent critic of global warming research
  • While insisting that they merely wanted to ensure that public policy was based on sound science, tobacco companies defined the term in a way that ensured that no science could ever be sound enough. The only sound science was certain science, which is an impossible standard to achieve.
  • “Doubt is our product,” wrote one employee of the Brown & Williamson tobacco company in a 1969 internal memo. The note went on to say that doubt “is the best means of competing with the ‘body of fact’” and “establishing a controversy.” These strategies for undermining inconvenient science were so effective that they’ve served as a sort of playbook for industry interests ever since
  • Doubt merchants aren’t pushing for knowledge, they’re practicing what Proctor has dubbed “agnogenesis” — the intentional manufacture of ignorance. This ignorance isn’t simply the absence of knowing something; it’s a lack of comprehension deliberately created by agents who don’t want you to know,
  • In the hands of doubt-makers, transparency becomes a rhetorical move. “It’s really difficult as a scientist or policy maker to make a stand against transparency and openness, because well, who would be against it?
  • But at the same time, “you can couch everything in the language of transparency and it becomes a powerful weapon.” For instance, when the EPA was preparing to set new limits on particulate pollution in the 1990s, industry groups pushed back against the research and demanded access to primary data (including records that researchers had promised participants would remain confidential) and a reanalysis of the evidence. Their calls succeeded and a new analysis was performed. The reanalysis essentially confirmed the original conclusions, but the process of conducting it delayed the implementation of regulations and cost researchers time and money.
  • Any given study can rarely answer more than one question at a time, and each study usually raises a bunch of new questions in the process of answering old ones. “Science is a process rather than an answer,” said psychologist Alison Ledgerwood of the University of California, Davis. Every answer is provisional and subject to change in the face of new evidence. It’s not entirely correct to say that “this study proves this fact,” Ledgerwood said. “We should be talking instead about how science increases or decreases our confidence in something.”
  • which has received funding from the oil and gas industry. “We’re the negative force. We’re just trying to stop stuff.”
  • these ploys are getting a fresh boost from Congress. The Data Quality Act (also known as the Information Quality Act) was reportedly written by an industry lobbyist and quietly passed as part of an appropriations bill in 2000. The rule mandates that federal agencies ensure the “quality, objectivity, utility, and integrity of information” that they disseminate, though it does little to define what these terms mean. The law also provides a mechanism for citizens and groups to challenge information that they deem inaccurate, including science that they disagree with. “It was passed in this very quiet way with no explicit debate about it — that should tell you a lot about the real goals,” Levy said.
  • in the 20 months following its implementation, the act was repeatedly used by industry groups to push back against proposed regulations and bog down the decision-making process. Instead of deploying transparency as a fundamental principle that applies to all science, these interests have used transparency as a weapon to attack very particular findings that they would like to eradicate.
  • Now Congress is considering another way to legislate how science is used. The Honest Act, a bill sponsored by Rep. Lamar Smith of Texas (the bill has been passed by the House but still awaits a vote in the Senate), is another example of what Levy calls a “Trojan horse” law that uses the language of transparency as a cover to achieve other political goals. Smith’s legislation would severely limit the kind of evidence the EPA could use for decision-making. Only studies whose raw data and computer codes were publicly available would be allowed for consideration.
  • It might seem like an easy task to sort good science from bad, but in reality it’s not so simple. “There’s a misplaced idea that we can definitively distinguish the good from the not-good science, but it’s all a matter of degree,” said Brian Nosek, executive director of the Center for Open Science. “There is no perfect study.” Requiring regulators to wait until they have (nonexistent) perfect evidence is essentially “a way of saying, ‘We don’t want to use evidence for our decision-making,’
  • Most scientific controversies aren’t about science at all, and once the sides are drawn, more data is unlikely to bring opponents into agreement.
  • objective knowledge is not enough to resolve environmental controversies. “While these controversies may appear on the surface to rest on disputed questions of fact, beneath often reside differing positions of value; values that can give shape to differing understandings of what ‘the facts’ are.” What’s needed in these cases isn’t more or better science, but mechanisms to bring those hidden values to the forefront of the discussion so that they can be debated transparently. “As long as we continue down this unabashedly naive road about what science is, and what it is capable of doing, we will continue to fail to reach any sort of meaningful consensus on these matters,”
  • The dispute over tobacco was never about the science of cigarettes’ link to cancer. It was about whether companies have the right to sell dangerous products and, if so, what obligations they have to the consumers who purchased them.
  • Similarly, the debate over climate change isn’t about whether our planet is heating, but about how much responsibility each country and person bears for stopping it
  • While researching her book “Merchants of Doubt,” science historian Naomi Oreskes found that some of the same people who were defending the tobacco industry as scientific experts were also receiving industry money to deny the role of human activity in global warming. What these issues had in common, she realized, was that they all involved the need for government action. “None of this is about the science. All of this is a political debate about the role of government,”
  • These controversies are really about values, not scientific facts, and acknowledging that would allow us to have more truthful and productive debates. What would that look like in practice? Instead of cherry-picking evidence to support a particular view (and insisting that the science points to a desired action), the various sides could lay out the values they are using to assess the evidence.
  • For instance, in Europe, many decisions are guided by the precautionary principle — a system that values caution in the face of uncertainty and says that when the risks are unclear, it should be up to industries to show that their products and processes are not harmful, rather than requiring the government to prove that they are harmful before they can be regulated. By contrast, U.S. agencies tend to wait for strong evidence of harm before issuing regulations
  • the difference between them comes down to priorities: Is it better to exercise caution at the risk of burdening companies and perhaps the economy, or is it more important to avoid potential economic downsides even if it means that sometimes a harmful product or industrial process goes unregulated?
  • But science can’t tell us how risky is too risky to allow products like cigarettes or potentially harmful pesticides to be sold — those are value judgements that only humans can make.
Javier E

Opinion | There's a Reason There Aren't Enough Teachers in America. Many Reasons, Actua... - 0 views

  • Here are just a few of the longstanding problems plaguing American education: a generalized decline in literacy; the faltering international performance of American students; an inability to recruit enough qualified college graduates into the teaching profession; a lack of trained and able substitutes to fill teacher shortages; unequal access to educational resources; inadequate funding for schools; stagnant compensation for teachers; heavier workloads; declining prestige; and deteriorating faculty morale.
  • Nine-year-old students earlier this year revealed “the largest average score decline in reading since 1990, and the first ever score decline in mathematics,”
  • In the latest comparison of fourth grade reading ability, the United States ranked below 15 countries, including Russia, Ireland, Poland and Bulgaria.
  • ...49 more annotations...
  • Teachers are not only burnt out and undercompensated, they are also demoralized. They are being asked to do things in the name of teaching that they believe are mis-educational and harmful to students and the profession. What made this work good for them is no longer accessible. That is why we are hearing so many refrains of “I’m not leaving the profession, my profession left me.”
  • We find there are at least 36,000 vacant positions along with at least 163,000 positions being held by underqualified teachers, both of which are conservative estimates of the extent of teacher shortages nationally.
  • “The current problem of teacher shortages (I would further break this down into vacancy and under-qualification) is higher than normal.” The data, Nguyen continued, “indicate that shortages are worsening over time, particularly over the last few years
  • a growing gap between the pay of all college graduates and teacher salaries from 1979 to 2021, with a sharp increase in the differential since 2010
  • The number of qualified teachers is declining for the whole country and the vast majority of states.
  • Wages are essentially unchanged from 2000 to 2020 after adjusting for inflation. Teachers have about the same number of students. But, teacher accountability reforms have increased the demands on their positions.
  • The pandemic was very difficult for teachers. Their self-reported level of stress was about as twice as high during the pandemic compared to other working adults. Teachers had to worry both about their personal safety and deal with teaching/caring for students who are grieving lost family members.
  • the number of students graduating from college with bachelor’s degrees in education fell from 176,307 in 1970-71 to 104,008 in 2010-11 to 85,058 in 2019-20.
  • “We do see that southern states (e.g., Mississippi, Alabama, Georgia and Florida) have very high vacancies and high vacancy rates.”
  • By 2021, teachers made $1,348, 32.9 percent less than what other graduates made, at $2,009.
  • These gaps play a significant role in determining the quality of teachers,
  • Sixty percent of teachers and 65 percent of principals reported believing that systemic racism exists. Only about 20 percent of teachers and principals reported that they believe systemic racism does not exist, and the remainder were not sure
  • “We find,” they write, “that teachers’ cognitive skills differ widely among nations — and that these differences matter greatly for students’ success in school. An increase of one standard deviation in teacher cognitive skills is associated with an increase of 10 to 15 percent of a standard deviation in student performance.”
  • teachers have lower cognitive skills, on average, in countries with greater nonteaching job opportunities for women in high-skill occupations and where teaching pays relatively less than other professions.
  • the scholars found that the cognitive skills of teachers in the United States fell in the middle ranks: Teachers in the United States perform worse than the average teacher sample-wide in numeracy, with a median score of 284 points out of a possible 500, compared to the sample-wide average of 292 points. In literacy, they perform slightly better than average, with a median score of 301 points compared to the sample-wide average of 295 points.
  • Increasing teacher numeracy skills by one standard deviation increases student performance by nearly 15 percent of a standard deviation on the PISA math test. Our estimate of the effect of increasing teacher literacy skills on students’ reading performance is slightly smaller, at 10 percent of a standard deviation.
  • How, then, to raise teacher skill level in the United States? Hanushek and his two colleagues have a simple answer: raise teacher pay to make it as attractive to college graduates as high-skill jobs in other fields.
  • policymakers will need to do more than raise teacher pay across the board to ensure positive results. They must ensure that higher salaries go to more effective teachers.
  • The teaching of disputed subjects in schools has compounded many of the difficulties in American education.
  • The researchers found that controversies over critical race theory, sex education and transgender issues — aggravated by divisive debates over responses to Covid and its aftermath — are inflicting a heavy toll on teachers and principals.
  • “On top of the herculean task of carrying out the essential functions of their jobs,” they write, “educators increasingly find themselves in the position of addressing contentious, politicized issues in their schools as the United States has experienced increasing political polarization.”
  • Teachers and principals, they add, “have been pulled in multiple directions as they try to balance and reconcile not only their own beliefs on such matters but also the beliefs of others around them, including their leaders, fellow staff, students, and students’ family members.”
  • These conflicting pressures take place in a climate where “emotions in response to these issues have run high within communities, resulting in the harassment of educators, bans against literature depicting diverse characters, and calls for increased parental involvement in deciding academic content.”
  • Forty-eight percent of principals and 40 percent of teachers reported that the intrusion of political issues and opinions in school leadership or teaching, respectively, was a job-related stressor. By comparison, only 16 percent of working adults indicated that the intrusion of political issues and opinions in their jobs was a source of job-related stress
  • In 1979, the average teacher weekly salary (in 2021 dollars) was $1,052, 22.9 percent less than other college graduates’, at $1,364
  • Nearly all Black or African American principals (92 percent) and teachers (87 percent) reported believing that systemic racism exists.
  • White educators working in predominantly white school systems reported substantially more pressure to deal with politically divisive issues than educators of color and those working in mostly minority schools: “Forty-one percent of white teachers and 52 percent of white principals selected the intrusion of political issues and opinions into their professions as a job-related stressor, compared with 36 percent of teachers of color and principals of color.”
  • A 54 percent majority of teachers and principals said there “should not be legal limits on classroom conversations about racism, sexism, and other topics,” while 20 percent said there should be legislated constraint
  • Voters, in turn, are highly polarized on the teaching of issues impinging on race or ethnicity in public schools. The Education Next 2022 Survey asked, for example: Some people think their local public schools place too little emphasis on slavery, racism and other challenges faced by Black people in the United States. Other people think their local public schools place too much emphasis on these topics. What is your view about your local public schools?
  • Among Democrats, 55 percent said too little emphasis was placed on slavery, racism and other challenges faced by Black people, and 8 percent said too much.
  • Among Republicans, 51 percent said too much and 10 percent said too little.
  • Because of the lack of reliable national data, there is widespread disagreement among scholars of education over the scope and severity of the shortage of credentialed teachers, although there is more agreement that these problems are worse in low-income, high majority-minority school systems and in STEM and special education faculties.
  • Public schools increasingly are targets of conservative political groups focusing on what they term “Critical Race Theory,” as well as issues of sexuality and gender identity. These political conflicts have created a broad chilling effect that has limited opportunities for students to practice respectful dialogue on controversial topics and made it harder to address rampant misinformation.
  • The chilling effect also has led to marked declines in general support for teaching about race, racism, and racial and ethnic diversity.
  • These political conflicts, the authors wrote, have made the already hard work of public education more difficult, undermining school management, negatively impacting staff, and heightening student stress and anxiety. Several principals shared that they were reconsidering their own roles in public education in light of the rage at teachers and rage at administrators’ playing out in their communities.
  • State University of New York tracked trends on “four interrelated constructs: professional prestige, interest among students, preparation for entry, and job satisfaction” for 50 years, from the 1970s to the present and found a consistent and dynamic pattern across every measure: a rapid decline in the 1970s, a swift rise in the 1980s, relative stability for two decades, and a sustained drop beginning around 2010. The current state of the teaching profession is at or near its lowest levels in 50 years.
  • Who among the next generation of college graduates will choose to teach?
  • Perceptions of teacher prestige have fallen between 20 percent and 47 percent in the last decade to be at or near the lowest levels recorded over the last half century
  • Interest in the teaching profession among high school seniors and college freshmen has fallen 50 percent since the 1990s, and 38 percent since 2010, reaching the lowest level in the last 50 years
  • the proportion of college graduates that go into teaching is at a 50-year low
  • Teachers’ job satisfaction is also at the lowest level in five decades, with the percent of teachers who feel the stress of their job is worth it dropping from 81 percent to 42 percent in the last 15 years
  • The combination of these factors — declining prestige, lower pay than other professions that require a college education, increased workloads, and political and ideological pressures — is creating both intended and unintended consequences for teacher accountability reforms mandating tougher licensing rules, evaluations and skill testing.
  • Education policy over the past decade has focused considerable effort on improving human capital in schools through teacher accountability. These reforms, and the research upon which they drew, were based on strong assumptions about how accountability would affect who decided to become a teacher. Counter to most assumptions, our findings document how teacher accountability reduced the supply of new teacher candidates by, in part, decreasing perceived job security, satisfaction and autonomy.
  • The reforms, Kraft and colleagues continued, increasedthe likelihood that schools could not fill vacant teaching positions. Even more concerning, effects on unfilled vacancies were concentrated in hard-to-staff schools that often serve larger populations of low-income students and students of color
  • We find that evaluation reforms increased the quality of newly hired novice teachers by reducing the number of teachers that graduated from the least selective institutions
  • We find no evidence that evaluation reforms served to attract teachers who attended the most selective undergraduate institutions.
  • In other words, the economic incentives, salary structure and work-life pressures characteristic of public education employment have created a climate in which contemporary education reforms have perverse and unintended consequences that can worsen rather than alleviate the problems facing school systems.
  • If so, to improve the overall quality of the nation’s more than three million public schoolteachers, reformers may want to give priority to paychecks, working conditions, teacher autonomy and punishing workloads before attempting to impose higher standards, tougher evaluations and less job security.
lilyrashkind

DeSantis courts further controversy by honoring swimmer who finished second to Lia Thom... - 0 views

  • The Republican governor, already embroiled in a fight with Disney over the state's so-called "Don't Say Gay" bill, claimed that the NCAA is "perpetuating a fraud" and declared University of Virginia freshman and Florida native Emma Weyant the "rightful winner" of the race. Weyant had finished about 1.75 seconds behind Thomas, who has come to personify the ongoing discourse on trans women's participation in sports and the balance between inclusion and fair play. "The NCAA is basically taking efforts to destroy women's athletics," the Republican governor said in a news conference. "They're trying to undermine the integrity of the competition and crown someone else."
  • Tuesday's proclamation comes against the backdrop of DeSantis' showdown with Disney over the controversial Florida bill that would ban classroom instruction about sexual orientation and gender identity before fourth grade. A day after Disney CEO Bob Chapek publicly condemned the legislation -- which DeSantis has said he will sign into law -- the Florida governor ripped Disney as a "woke corporation" to a room of supporters.
  • "In Florida, we reject these lies and recognize Sarasota's Emma Weyant as the best women's swimmer in the 500y freestyle," he said in a tweet.
  • ...2 more annotations...
  • While sex is a category that refers broadly to physiology, a person's gender is an innate sense of identity. The factors that go into determining the sex listed on a birth certificate may include anatomy, genetics and hormones, and there is broad natural variation in each of these categories. For this reason, critics have said the language of "biological sex," as used in DeSantis' proclamation, is overly simplistic and misleading. A 2017 report in the journal Sports Medicine that reviewed several related studies found "no direct or consistent research" on trans people having an athletic advantage over their cisgender peers, at any stage of their transition, and critics say postures like DeSantis' will only add to the discrimination that trans people face, particularly trans youth.
  • So far this year, Iowa and South Dakota have approved legislation banning transgender women and girls from participating on sports teams consistent with their gender at accredited schools and colleges. And last year, Alabama, Arkansas, Florida, Mississippi, Montana, Tennessee, Texas and West Virginia enacted similar sports bans, infuriating LGBTQ advocates, who argue conservatives are creating an issue where there isn't one.
Javier E

George Orwell: The Prevention of Literature - The Atlantic - 0 views

  • the much more tenable and dangerous proposition that freedom is undesirable and that intellectual honesty is a form of antisocial selfishness
  • the controversy over freedom of speech and of the press is at bottom a controversy over the desirability, or otherwise, of telling lies.
  • What is really at issue is the right to report contemporary events truthfully, or as truthfully as is consistent with the ignorance, bias, and self-deception from which every observer necessarily suffers
  • ...10 more annotations...
  • it is necessary to strip away the irrelevancies in which this controversy is usually wrapped up.
  • The enemies of intellectual liberty always try to present their case as a plea for discipline versus individualism.
  • The issue truth-versus-untruth is as far as possible kept in the background.
  • the writer who refuses to sell his opinions is always branded as a mere egoist. He is accused, that is, either of wanting to shut himself up in an ivory tower, or of making an exhibitionist display of his own personality, or of resisting the inevitable current of history in an attempt to cling to unjustified privileges.
  • Each of them tacitly claims that “the truth” has already been revealed, and that the heretic, if he is not simply a fool, is secretly aware of “the truth” and merely resists it out of selfish motives.
  • Freedom of the intellect means the freedom to report what one has seen, heard, and felt, and not to be obliged to fabricate imaginary facts and feelings.
  • known facts are suppressed and distorted to such an extent as to make it doubtful whether a true history of our times can ever be written.
  • A totalitarian state is in effect a theocracy, and its ruling caste, in order to keep its position, has to be thought of as infallible. But since, in practice, no one is infallible, it is frequently necessary to rearrange past events in order to show that this or that mistake was not made, or that this or that imaginary triumph actually happened
  • Then, again, every major change in policy demands a corresponding change of doctrine and a revaluation of prominent historical figures. This kind of thing happens everywhere, but clearly it is likelier to lead to outright falsification in societies where only one opinion is permissible at any given moment.
  • The friends of totalitarianism in England usually tend to argue that since absolute truth is not attainable, a big lie is no worse than a little lie. It is pointed out that all historical records are biased and inaccurate, or, on the other hand, that modern physics has proved that what seems to us the real world is an illusion, so that to believe in the evidence of one’s senses is simply vulgar philistinism.
Javier E

Losing Earth: The Decade We Almost Stopped Climate Change - The New York Times - 0 views

  • As Malcolm Forbes Baldwin, the acting chairman of the president’s Council for Environmental Quality, told industry executives in 1981, “There can be no more important or conservative concern than the protection of the globe itself.”
  • Among those who called for urgent, immediate and far-reaching climate policy were Senators John Chafee, Robert Stafford and David Durenberger; the E.P.A. administrator, William K. Reilly; and, during his campaign for president, George H.W. Bush.
  • It was understood that action would have to come immediately. At the start of the 1980s, scientists within the federal government predicted that conclusive evidence of warming would appear on the global temperature record by the end of the decade, at which point it would be too late to avoid disaster.
  • ...180 more annotations...
  • If the world had adopted the proposal widely endorsed at the end of the ’80s — a freezing of carbon emissions, with a reduction of 20 percent by 2005 — warming could have been held to less than 1.5 degrees.
  • Action had to be taken, and the United States would need to lead. It didn’t.
  • There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
  • The first suggestion to Rafe Pomerance that humankind was destroying the conditions necessary for its own survival came on Page 66 of the government publication EPA-600/7-78-019. It was a technical report about coal
  • ‘This Is the Whole Banana’ Spring 1979
  • Here was an urgent problem that demanded their attention, MacDonald believed, because human civilization faced an existential crisis. In “How to Wreck the Environment,” a 1968 essay published while he was a science adviser to Lyndon Johnson, MacDonald predicted a near future in which “nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe.” One of the most potentially devastating such weapons, he believed, was the gas that we exhaled with every breath: carbon dioxide. By vastly increasing carbon emissions, the world’s most advanced militaries could alter weather patterns and wreak famine, drought and economic collapse.
  • the Jasons. They were like one of those teams of superheroes with complementary powers that join forces in times of galactic crisis. They had been brought together by federal agencies, including the C.I.A., to devise scientific solutions to national-security problems: how to detect an incoming missile; how to predict fallout from a nuclear bomb; how to develop unconventional weapons, like plague-infested rats.
  • Agle pointed to an article about a prominent geophysicist named Gordon MacDonald, who was conducting a study on climate change with the Jasons, the mysterious coterie of elite scientists to which he belonged
  • During the spring of 1977 and the summer of 1978, the Jasons met to determine what would happen once the concentration of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. It was an arbitrary milestone, the doubling, but a useful one, as its inevitability was not in question; the threshold would most likely be breached by 2035.
  • The Jasons’ report to the Department of Energy, “The Long-Term Impact of Atmospheric Carbon Dioxide on Climate,” was written in an understated tone that only enhanced its nightmarish findings: Global temperatures would increase by an average of two to three degrees Celsius; Dust Bowl conditions would “threaten large areas of North America, Asia and Africa”; access to drinking water and agricultural production would fall, triggering mass migration on an unprecedented scale. “Perhaps the most ominous feature,” however, was the effect of a changing climate on the poles. Even a minimal warming “could lead to rapid melting” of the West Antarctic ice sheet. The ice sheet contained enough water to raise the level of the oceans 16 feet.
  • MacDonald explained that he first studied the carbon-dioxide issue when he was about Pomerance’s age — in 1961, when he served as an adviser to John F. Kennedy. Pomerance pieced together that MacDonald, in his youth, had been something of a prodigy: In his 20s, he advised Dwight D. Eisenhower on space exploration; at 32, he became a member of the National Academy of Sciences; at 40, he was appointed to the inaugural Council on Environmental Quality, where he advised Richard Nixon on the environmental dangers of burning coal. He monitored the carbon-dioxide problem the whole time, with increasing alarm.
  • They were surprised to learn how few senior officials were familiar with the Jasons’ findings, let alone understood the ramifications of global warming. At last, having worked their way up the federal hierarchy, the two went to see the president’s top scientist, Frank Press.
  • Thus began the Gordon and Rafe carbon-dioxide roadshow. Beginning in the spring of 1979, Pomerance arranged informal briefings with the E.P.A., the National Security Council, The New York Times, the Council on Environmental Quality and the Energy Department, which, Pomerance learned, had established an Office of Carbon Dioxide Effects two years earlier at MacDonald’s urging
  • Out of respect for MacDonald, Press had summoned to their meeting what seemed to be the entire senior staff of the president’s Office of Science and Technology Policy — the officials consulted on every critical matter of energy and national security. What Pomerance had expected to be yet another casual briefing assumed the character of a high-level national-security meeting.
  • MacDonald would begin his presentation by going back more than a century to John Tyndall — an Irish physicist who was an early champion of Charles Darwin’s work and died after being accidentally poisoned by his wife. In 1859, Tyndall found that carbon dioxide absorbed heat and that variations in the composition of the atmosphere could create changes in climate. These findings inspired Svante Arrhenius, a Swedish chemist and future Nobel laureate, to deduce in 1896 that the combustion of coal and petroleum could raise global temperatures. This warming would become noticeable in a few centuries, Arrhenius calculated, or sooner if consumption of fossil fuels continued to increase.
  • Four decades later, a British steam engineer named Guy Stewart Callendar discovered that, at the weather stations he observed, the previous five years were the hottest in recorded history. Humankind, he wrote in a paper, had become “able to speed up the processes of Nature.” That was in 1939.
  • MacDonald’s history concluded with Roger Revelle, perhaps the most distinguished of the priestly caste of government scientists who, since the Manhattan Project, advised every president on major policy; he had been a close colleague of MacDonald and Press since they served together under Kennedy. In a 1957 paper written with Hans Suess, Revelle concluded that “human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.” Revelle helped the Weather Bureau establish a continuous measurement of atmospheric carbon dioxide at a site perched near the summit of Mauna Loa on the Big Island of Hawaii, 11,500 feet above the sea — a rare pristine natural laboratory on a planet blanketed by fossil-fuel emissions.
  • After nearly a decade of observation, Revelle had shared his concerns with Lyndon Johnson, who included them in a special message to Congress two weeks after his inauguration. Johnson explained that his generation had “altered the composition of the atmosphere on a global scale” through the burning of fossil fuels, and his administration commissioned a study of the subject by his Science Advisory Committee. Revelle was its chairman, and its 1965 executive report on carbon dioxide warned of the rapid melting of Antarctica, rising seas, increased acidity of fresh waters — changes that would require no less than a coordinated global effort to forestall. Yet emissions continued to rise, and at this rate, MacDonald warned, they could see a snowless New England, the swamping of major coastal cities, as much as a 40 percent decline in national wheat production, the forced migration of about one-quarter of the world’s population. Not within centuries — within their own lifetimes.
  • On May 22, Press wrote a letter to the president of the National Academy of Sciences requesting a full assessment of the carbon-dioxide issue. Jule Charney, the father of modern meteorology, would gather the nation’s top oceanographers, atmospheric scientists and climate modelers to judge whether MacDonald’s alarm was justified — whether the world was, in fact, headed to cataclysm.
  • If Charney’s group confirmed that the world was careering toward an existential crisis, the president would be forced to act.
  • Hansen turned from the moon to Venus. Why, he tried to determine, was its surface so hot? In 1967, a Soviet satellite beamed back the answer: The planet’s atmosphere was mainly carbon dioxide. Though once it may have had habitable temperatures, it was believed to have succumbed to a runaway greenhouse effect: As the sun grew brighter, Venus’s ocean began to evaporate, thickening the atmosphere, which forced yet greater evaporation — a self-perpetuating cycle that finally boiled off the ocean entirely and heated the planet’s surface to more than 800 degrees Fahrenheit
  • At the other extreme, Mars’s thin atmosphere had insufficient carbon dioxide to trap much heat at all, leaving it about 900 degrees colder. Earth lay in the middle, its Goldilocks greenhouse effect just strong enough to support life.
  • We want to learn more about Earth’s climate, Jim told Anniek — and how humanity can influence it. He would use giant new supercomputers to map the planet’s atmosphere. They would create Mirror Worlds: parallel realities that mimicked our own. These digital simulacra, technically called “general circulation models,” combined the mathematical formulas that governed the behavior of the sea, land and sky into a single computer model. Unlike the real world, they could be sped forward to reveal the future.
  • The government officials, many of them scientists themselves, tried to suppress their awe of the legends in their presence: Henry Stommel, the world’s leading oceanographer; his protégé, Carl Wunsch, a Jason; the Manhattan Project alumnus Cecil Leith; the Harvard planetary physicist Richard Goody. These were the men who, in the last three decades, had discovered foundational principles underlying the relationships among sun, atmosphere, land and ocean — which is to say, the climate.
  • When, at Charney’s request, Hansen programmed his model to consider a future of doubled carbon dioxide, it predicted a temperature increase of four degrees Celsius. That was twice as much warming as the prediction made by the most prominent climate modeler, Syukuro Manabe, whose government lab at Princeton was the first to model the greenhouse effect. The difference between the two predictions — between warming of two degrees Celsius and four degrees Celsius — was the difference between damaged coral reefs and no reefs whatsoever, between thinning forests and forests enveloped by desert, between catastrophe and chaos.
  • The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
  • within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius
  • The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
  • After the publication of the Charney report, Exxon decided to create its own dedicated carbon-dioxide research program, with an annual budget of $600,000. Only Exxon was asking a slightly different question than Jule Charney. Exxon didn’t concern itself primarily with how much the world would warm. It wanted to know how much of the warming Exxon could be blamed for.
  • “It behooves us to start a very aggressive defensive program,” Shaw wrote in a memo to a manager, “because there is a good probability that legislation affecting our business will be passed.”
  • Shaw turned to Wallace Broecker, a Columbia University oceanographer who was the second author of Roger Revelle’s 1965 carbon-dioxide report for Lyndon Johnson. In 1977, in a presentation at the American Geophysical Union, Broecker predicted that fossil fuels would have to be restricted, whether by taxation or fiat. More recently, he had testified before Congress, calling carbon dioxide “the No.1 long-term environmental problem.” If presidents and senators trusted Broecker to tell them the bad news, he was good enough for Exxon.
  • The company had been studying the carbon-dioxide problem for decades, since before it changed its name to Exxon. In 1957, scientists from Humble Oil published a study tracking “the enormous quantity of carbon dioxide” contributed to the atmosphere since the Industrial Revolution “from the combustion of fossil fuels.” Even then, the observation that burning fossil fuels had increased the concentration of carbon in the atmosphere was well understood and accepted by Humble’s scientists.
  • The American Petroleum Institute, the industry’s largest trade association, asked the same question in 1958 through its air-pollution study group and replicated the findings made by Humble Oil. So did another A.P.I. study conducted by the Stanford Research Institute a decade later, in 1968, which concluded that the burning of fossil fuels would bring “significant temperature changes” by the year 2000 and ultimately “serious worldwide environmental changes,” including the melting of the Antarctic ice cap and rising seas.
  • The ritual repeated itself every few years. Industry scientists, at the behest of their corporate bosses, reviewed the problem and found good reasons for alarm and better excuses to do nothing. Why should they act when almost nobody within the United States government — nor, for that matter, within the environmental movement — seemed worried?
  • Why take on an intractable problem that would not be detected until this generation of employees was safely retired? Worse, the solutions seemed more punitive than the problem itself. Historically, energy use had correlated to economic growth — the more fossil fuels we burned, the better our lives became. Why mess with that?
  • That June, Jimmy Carter signed the Energy Security Act of 1980, which directed the National Academy of Sciences to start a multiyear, comprehensive study, to be called “Changing Climate,” that would analyze social and economic effects of climate change. More urgent, the National Commission on Air Quality, at the request of Congress, invited two dozen experts, including Henry Shaw himself, to a meeting in Florida to propose climate policy.
  • On April 3, 1980, Senator Paul Tsongas, a Massachusetts Democrat, held the first congressional hearing on carbon-dioxide buildup in the atmosphere. Gordon MacDonald testified that the United States should “take the initiative” and develop, through the United Nations, a way to coordinate every nation’s energy policies to address the problem.
  • During the expansion of the Clean Air Act, he pushed for the creation of the National Commission on Air Quality, charged with ensuring that the goals of the act were being met. One such goal was a stable global climate. The Charney report had made clear that goal was not being met, and now the commission wanted to hear proposals for legislation. It was a profound responsibility, and the two dozen experts invited to the Pink Palace — policy gurus, deep thinkers, an industry scientist and an environmental activist — had only three days to achieve it, but the utopian setting made everything seem possible
  • We have less time than we realize, said an M.I.T. nuclear engineer named David Rose, who studied how civilizations responded to large technological crises. “People leave their problems until the 11th hour, the 59th minute,” he said. “And then: ‘Eloi, Eloi, Lama Sabachthani?’ ” — “My God, my God, why hast thou forsaken me?”
  • The attendees seemed to share a sincere interest in finding solutions. They agreed that some kind of international treaty would ultimately be needed to keep atmospheric carbon dioxide at a safe level. But nobody could agree on what that level was.
  • William Elliott, a NOAA scientist, introduced some hard facts: If the United States stopped burning carbon that year, it would delay the arrival of the doubling threshold by only five years. If Western nations somehow managed to stabilize emissions, it would forestall the inevitable by only eight years. The only way to avoid the worst was to stop burning coal. Yet China, the Soviet Union and the United States, by far the world’s three largest coal producers, were frantically accelerating extraction.
  • “Do we have a problem?” asked Anthony Scoville, a congressional science consultant. “We do, but it is not the atmospheric problem. It is the political problem.” He doubted that any scientific report, no matter how ominous its predictions, would persuade politicians to act.
  • The talk of ending oil production stirred for the first time the gentleman from Exxon. “I think there is a transition period,” Henry Shaw said. “We are not going to stop burning fossil fuels and start looking toward solar or nuclear fusion and so on. We are going to have a very orderly transition from fossil fuels to renewable energy sources.”
  • What if the problem was that they were thinking of it as a problem? “What I am saying,” Scoville continued, “is that in a sense we are making a transition not only in energy but the economy as a whole.” Even if the coal and oil industries collapsed, renewable technologies like solar energy would take their place. Jimmy Carter was planning to invest $80 billion in synthetic fuel. “My God,” Scoville said, “with $80 billion, you could have a photovoltaics industry going that would obviate the need for synfuels forever!”
  • nobody could agree what to do. John Perry, a meteorologist who had worked as a staff member on the Charney report, suggested that American energy policy merely “take into account” the risks of global warming, though he acknowledged that a nonbinding measure might seem “intolerably stodgy.” “It is so weak,” Pomerance said, the air seeping out of him, “as to not get us anywhere.”
  • Scoville pointed out that the United States was responsible for the largest share of global carbon emissions. But not for long. “If we’re going to exercise leadership,” he said, “the opportunity is now.”
  • One way to lead, he proposed, would be to classify carbon dioxide as a pollutant under the Clean Air Act and regulate it as such. This was received by the room like a belch. By Scoville’s logic, every sigh was an act of pollution. Did the science really support such an extreme measure? The Charney report did exactly that, Pomerance said.
  • Slade, the director of the Energy Department’s carbon-dioxide program, considered the lag a saving grace. If changes did not occur for a decade or more, he said, those in the room couldn’t be blamed for failing to prevent them. So what was the problem?
  • “Call it whatever.” Besides, Pomerance added, they didn’t have to ban coal tomorrow. A pair of modest steps could be taken immediately to show the world that the United States was serious: the implementation of a carbon tax and increased investment in renewable energy. Then the United States could organize an international summit meeting to address climate change
  • these two dozen experts, who agreed on the major points and had made a commitment to Congress, could not draft a single paragraph. Hours passed in a hell of fruitless negotiation, self-defeating proposals and impulsive speechifying. Pomerance and Scoville pushed to include a statement calling for the United States to “sharply accelerate international dialogue,” but they were sunk by objections and caveats.
  • They never got to policy proposals. They never got to the second paragraph. The final statement was signed by only the moderator, who phrased it more weakly than the declaration calling for the workshop in the first place. “The guide I would suggest,” Jorling wrote, “is whether we know enough not to recommend changes in existing policy.”
  • Pomerance had seen enough. A consensus-based strategy would not work — could not work — without American leadership. And the United States wouldn’t act unless a strong leader persuaded it to do so — someone who would speak with authority about the science, demand action from those in power and risk everything in pursuit of justice.
  • The meeting ended Friday morning. On Tuesday, four days later, Ronald Reagan was elected president.
  • ‘Otherwise, They’ll Gurgle’ November 1980-September 1981
  • In the midst of this carnage, the Council on Environmental Quality submitted a report to the White House warning that fossil fuels could “permanently and disastrously” alter Earth’s atmosphere, leading to “a warming of the Earth, possibly with very serious effects.” Reagan did not act on the council’s advice. Instead, his administration considered eliminating the council.
  • After the election, Reagan considered plans to close the Energy Department, increase coal production on federal land and deregulate surface coal mining. Once in office, he appointed James Watt, the president of a legal firm that fought to open public lands to mining and drilling, to run the Interior Department. “We’re deliriously happy,” the president of the National Coal Association was reported to have said. Reagan preserved the E.P.A. but named as its administrator Anne Gorsuch, an anti-regulation zealot who proceeded to cut the agency’s staff and budget by about a quarter
  • Reagan “has declared open war on solar energy,” the director of the nation’s lead solar-energy research agency said, after he was asked to resign). Reagan appeared determined to reverse the environmental achievements of Jimmy Carter, before undoing those of Richard Nixon, Lyndon Johnson, John F. Kennedy and, if he could get away with it, Theodore Roosevelt.
  • When Reagan considered closing the Council on Environmental Quality, its acting chairman, Malcolm Forbes Baldwin, wrote to the vice president and the White House chief of staff begging them to reconsider; in a major speech the same week, “A Conservative’s Program for the Environment,” Baldwin argued that it was “time for today’s conservatives explicitly to embrace environmentalism.” Environmental protection was not only good sense. It was good business. What could be more conservative than an efficient use of resources that led to fewer federal subsidies?
  • Meanwhile the Charney report continued to vibrate at the periphery of public consciousness. Its conclusions were confirmed by major studies from the Aspen Institute, the International Institute for Applied Systems Analysis near Vienna and the American Association for the Advancement of Science. Every month or so, nationally syndicated articles appeared summoning apocalypse: “Another Warning on ‘Greenhouse Effect,’ ” “Global Warming Trend ‘Beyond Human Experience,’ ” “Warming Trend Could ‘Pit Nation Against Nation.’
  • Pomerance read on the front page of The New York Times on Aug. 22, 1981, about a forthcoming paper in Science by a team of seven NASA scientists. They had found that the world had already warmed in the past century. Temperatures hadn’t increased beyond the range of historical averages, but the scientists predicted that the warming signal would emerge from the noise of routine weather fluctuations much sooner than previously expected. Most unusual of all, the paper ended with a policy recommendation: In the coming decades, the authors wrote, humankind should develop alternative sources of energy and use fossil fuels only “as necessary.” The lead author was James Hansen.
  • Pomerance listened and watched. He understood Hansen’s basic findings well enough: Earth had been warming since 1880, and the warming would reach “almost unprecedented magnitude” in the next century, leading to the familiar suite of terrors, including the flooding of a 10th of New Jersey and a quarter of Louisiana and Florida. But Pomerance was excited to find that Hansen could translate the complexities of atmospheric science into plain English.
  • 7. ‘We’re All Going to Be the Victims’ March 1982
  • Gore had learned about climate change a dozen years earlier as an undergraduate at Harvard, when he took a class taught by Roger Revelle. Humankind was on the brink of radically transforming the global atmosphere, Revelle explained, drawing Keeling’s rising zigzag on the blackboard, and risked bringing about the collapse of civilization. Gore was stunned: Why wasn’t anyone talking about this?
  • Most in Congress considered the science committee a legislative backwater, if they considered it at all; this made Gore’s subcommittee, which had no legislative authority, an afterthought to an afterthought. That, Gore vowed, would change. Environmental and health stories had all the elements of narrative drama: villains, victims and heroes. In a hearing, you could summon all three, with the chairman serving as narrator, chorus and moral authority. He told his staff director that he wanted to hold a hearing every week.
  • The Revelle hearing went as Grumbly had predicted. The urgency of the issue was lost on Gore’s older colleagues, who drifted in and out while the witnesses testified. There were few people left by the time the Brookings Institution economist Lester Lave warned that humankind’s profligate exploitation of fossil fuels posed an existential test to human nature. “Carbon dioxide stands as a symbol now of our willingness to confront the future,” he said. “It will be a sad day when we decide that we just don’t have the time or thoughtfulness to address those issues.”
  • That night, the news programs featured the resolution of the baseball strike, the ongoing budgetary debate and the national surplus of butter.
  • There emerged, despite the general comity, a partisan divide. Unlike the Democrats, the Republicans demanded action. “Today I have a sense of déjà vu,” said Robert Walker, a Republican from Pennsylvania. In each of the last five years, he said, “we have been told and told and told that there is a problem with the increasing carbon dioxide in the atmosphere. We all accept that fact, and we realize that the potential consequences are certainly major in their impact on mankind.” Yet they had failed to propose a single law. “Now is the time,” he said. “The research is clear. It is up to us now to summon the political will.”
  • Hansen flew to Washington to testify on March 25, 1982, performing before a gallery even more thinly populated than at Gore’s first hearing on the greenhouse effect. Gore began by attacking the Reagan administration for cutting funding for carbon-dioxide research despite the “broad consensus in the scientific community that the greenhouse effect is a reality.” William Carney, a Republican from New York, bemoaned the burning of fossil fuels and argued passionately that science should serve as the basis for legislative policy
  • the experts invited by Gore agreed with the Republicans: The science was certain enough. Melvin Calvin, a Berkeley chemist who won the Nobel Prize for his work on the carbon cycle, said that it was useless to wait for stronger evidence of warming. “You cannot do a thing about it when the signals are so big that they come out of the noise,” he said. “You have to look for early warning signs.”
  • Hansen’s job was to share the warning signs, to translate the data into plain English. He explained a few discoveries that his team had made — not with computer models but in libraries. By analyzing records from hundreds of weather stations, he found that the surface temperature of the planet had already increased four-tenths of a degree Celsius in the previous century. Data from several hundred tide-gauge stations showed that the oceans had risen four inches since the 1880s
  • It occurred to Hansen that this was the only political question that mattered: How long until the worst began? It was not a question on which geophysicists expended much effort; the difference between five years and 50 years in the future was meaningless in geologic time. Politicians were capable of thinking only in terms of electoral time: six years, four years, two years. But when it came to the carbon problem, the two time schemes were converging.
  • “Within 10 or 20 years,” Hansen said, “we will see climate changes which are clearly larger than the natural variability.” James Scheuer wanted to make sure he understood this correctly. No one else had predicted that the signal would emerge that quickly. “If it were one or two degrees per century,” he said, “that would be within the range of human adaptability. But we are pushing beyond the range of human adaptability.” “Yes,” Hansen said.
  • How soon, Scheuer asked, would they have to change the national model of energy production? Hansen hesitated — it wasn’t a scientific question. But he couldn’t help himself. He had been irritated, during the hearing, by all the ludicrous talk about the possibility of growing more trees to offset emissions. False hopes were worse than no hope at all: They undermined the prospect of developing real solutions. “That time is very soon,” Hansen said finally. “My opinion is that it is past,” Calvin said, but he was not heard because he spoke from his seat. He was told to speak into the microphone. “It is already later,” Calvin said, “than you think.”
  • From Gore’s perspective, the hearing was an unequivocal success. That night Dan Rather devoted three minutes of “CBS Evening News” to the greenhouse effect. A correspondent explained that temperatures had increased over the previous century, great sheets of pack ice in Antarctica were rapidly melting, the seas were rising; Calvin said that “the trend is all in the direction of an impending catastrophe”; and Gore mocked Reagan for his shortsightedness. Later, Gore could take credit for protecting the Energy Department’s carbon-dioxide program, which in the end was largely preserved.
  • 8. ‘The Direction of an Impending Catastrophe’ 1982
  • Following Henry Shaw’s recommendation to establish credibility ahead of any future legislative battles, Exxon had begun to spend conspicuously on global-warming research. It donated tens of thousands of dollars to some of the most prominent research efforts, including one at Woods Hole led by the ecologist George Woodwell, who had been calling for major climate policy as early as the mid-1970s, and an international effort coordinated by the United Nations. Now Shaw offered to fund the October 1982 symposium on climate change at Columbia’s Lamont-Doherty campus.
  • Edward David Jr., the president of Exxon Research and Engineering, boasted that Exxon would usher in a new global energy system to save the planet from the ravages of climate change. He went so far as to argue that capitalism’s blind faith in the wisdom of the free market was “less than satisfying” when it came to the greenhouse effect. Ethical considerations were necessary, too. He pledged that Exxon would revise its corporate strategy to account for climate change, even if it were not “fashionable” to do so. As Exxon had already made heavy investments in nuclear and solar technology, he was “generally upbeat” that Exxon would “invent” a future of renewable energy.
  • Hansen had reason to feel upbeat himself. If the world’s largest oil-and-gas company supported a new national energy model, the White House would not stand in its way. The Reagan administration was hostile to change from within its ranks. But it couldn’t be hostile to Exxon.
  • The carbon-dioxide issue was beginning to receive major national attention — Hansen’s own findings had become front-page news, after all. What started as a scientific story was turning into a political story.
  • The political realm was itself a kind of Mirror World, a parallel reality that crudely mimicked our own. It shared many of our most fundamental laws, like the laws of gravity and inertia and publicity. And if you applied enough pressure, the Mirror World of politics could be sped forward to reveal a new future. Hansen was beginning to understand that too.
  • 1. ‘Caution, Not Panic’ 1983-1984
  • In the fall of 1983, the climate issue entered an especially long, dark winter. And all because of a single report that had done nothing to change the state of climate science but transformed the state of climate politics.
  • After the publication of the Charney report in 1979, Jimmy Carter had directed the National Academy of Sciences to prepare a comprehensive, $1 million analysis of the carbon-dioxide problem: a Warren Commission for the greenhouse effect. A team of scientist-dignitaries — among them Revelle, the Princeton modeler Syukuro Manabe and the Harvard political economist Thomas Schelling, one of the intellectual architects of Cold War game theory — would review the literature, evaluate the consequences of global warming for the world order and propose remedies.
  • Then Reagan won the White House.
  • The incipient report served as the Reagan administration’s answer to every question on the subject. There could be no climate policy, Fred Koomanoff and his associates said, until the academy ruled. In the Mirror World of the Reagan administration, the warming problem hadn’t been abandoned at all. A careful, comprehensive solution was being devised. Everyone just had to wait for the academy’s elders to explain what it was.
  • The committee’s chairman, William Nierenberg — a Jason, presidential adviser and director of Scripps, the nation’s pre-eminent oceanographic institution — argued that action had to be taken immediately, before all the details could be known with certainty, or else it would be too late.
  • Better to bet on American ingenuity to save the day. Major interventions in national energy policy, taken immediately, might end up being more expensive, and less effective, than actions taken decades in the future, after more was understood about the economic and social consequences of a warmer planet. Yes, the climate would change, mostly for the worse, but future generations would be better equipped to change with it.
  • Government officials who knew Nierenberg were not surprised by his conclusions: He was an optimist by training and experience, a devout believer in the doctrine of American exceptionalism, one of the elite class of scientists who had helped the nation win a global war, invent the most deadly weapon conceivable and create the booming aerospace and computer industries. America had solved every existential problem it had confronted over the previous generation; it would not be daunted by an excess of carbon dioxide. Nierenberg had also served on Reagan’s transition team. Nobody believed that he had been directly influenced by his political connections, but his views — optimistic about the saving graces of market forces, pessimistic about the value of government regulation — reflected all the ardor of his party.
  • That’s what Nierenberg wrote in “Changing Climate.” But it’s not what he said in the press interviews that followed. He argued the opposite: There was no urgent need for action. The public should not entertain the most “extreme negative speculations” about climate change (despite the fact that many of those speculations appeared in his report). Though “Changing Climate” urged an accelerated transition to renewable fuels, noting that it would take thousands of years for the atmosphere to recover from the damage of the last century, Nierenberg recommended “caution, not panic.” Better to wait and see.
  • The damage of “Changing Climate” was squared by the amount of attention it received. Nierenberg’s speech in the Great Hall, being one-500th the length of the actual assessment, received 500 times the press coverage. As The Wall Street Journal put it, in a line echoed by trade journals across the nation: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: You can cope.”
  • On “CBS Evening News,” Dan Rather said the academy had given “a cold shoulder” to a grim, 200-page E.P.A. assessment published earlier that week (titled “Can We Delay a Greenhouse Warming?”; the E.P.A.’s answer, reduced to a word, was no). The Washington Post described the two reports, taken together, as “clarion calls to inaction.”
  • George Keyworth II, Reagan’s science adviser, used Nierenberg’s optimism as reason to discount the E.P.A.’s “unwarranted and unnecessarily alarmist” report and warned against taking any “near-term corrective action” on global warming. Just in case it wasn’t clear, Keyworth added, “there are no actions recommended other than continued research.”
  • Edward David Jr., two years removed from boasting of Exxon’s commitment to transforming global energy policy, told Science that the corporation had reconsidered. “Exxon has reverted to being mainly a supplier of conventional hydrocarbon fuels — petroleum products, natural gas and steam coal,” David said. The American Petroleum Institute canceled its own carbon-dioxide research program, too.
  • Exxon soon revised its position on climate-change research. In a presentation at an industry conference, Henry Shaw cited “Changing Climate” as evidence that “the general consensus is that society has sufficient time to technologically adapt to a CO₂ greenhouse effect.” If the academy had concluded that regulations were not a serious option, why should Exxon protest?
  • 2. ‘You Scientists Win’ 1985
  • 3. The Size of The Human Imagination Spring-Summer 1986
  • Curtis Moore’s proposal: use the ozone issue to revive the climate issue. The ozone hole had a solution — an international treaty, already in negotiation. Why not hitch the milk wagon to the bullet train? Pomerance was skeptical. The problems were related, sure: Without a reduction in CFC emissions, you didn’t have a chance of averting cataclysmic global warming. But it had been difficult enough to explain the carbon issue to politicians and journalists; why complicate the sales pitch? Then again, he didn’t see what choice he had. The Republicans controlled the Senate, and Moore was his connection to the Senate’s environmental committee.
  • Pomerance met with Senator John Chafee, a Republican from Rhode Island, and helped persuade him to hold a double-barreled hearing on the twin problems of ozone and carbon dioxide on June 10 and 11, 1986.
  • F. Sherwood Rowland, Robert Watson, a NASA scientist, and Richard Benedick, the administration’s lead representative in international ozone negotiations, would discuss ozone; James Hansen, Al Gore, the ecologist George Woodwell and Carl Wunsch, a veteran of the Charney group, would testify about climate change.
  • As Pomerance had hoped, fear about the ozone layer ensured a bounty of press coverage for the climate-change testimony. But as he had feared, it caused many people to conflate the two crises. One was Peter Jennings, who aired the video on ABC’s “World News Tonight,” warning that the ozone hole “could lead to flooding all over the world, also to drought and to famine.”
  • The confusion helped: For the first time since the “Changing Climate” report, global-warming headlines appeared by the dozen. William Nierenberg’s “caution, not panic” line was inverted. It was all panic without a hint of caution: “A Dire Forecast for ‘Greenhouse’ Earth” (the front page of The Washington Post); “Scientists Predict Catastrophes in Growing Global Heat Wave” (Chicago Tribune); “Swifter Warming of Globe Foreseen” (The New York Times).
  • After three years of backsliding and silence, Pomerance was exhilarated to see interest in the issue spike overnight. Not only that: A solution materialized, and a moral argument was passionately articulated — by Rhode Island’s Republican senator no less. “Ozone depletion and the greenhouse effect can no longer be treated solely as important scientific questions,” Chafee said. “They must be seen as critical problems facing the nations of the world, and they are problems that demand solutions.”
  • The old canard about the need for more research was roundly mocked — by Woodwell, by a W.R.I. colleague named Andrew Maguire, by Senator George Mitchell, a Democrat from Maine. “Scientists are never 100 percent certain,” the Princeton historian Theodore Rabb testified. “That notion of total certainty is something too elusive ever to be sought.” As Pomerance had been saying since 1979, it was past time to act. Only now the argument was so broadly accepted that nobody dared object.
  • The ozone hole, Pomerance realized, had moved the public because, though it was no more visible than global warming, people could be made to see it. They could watch it grow on video. Its metaphors were emotionally wrought: Instead of summoning a glass building that sheltered plants from chilly weather (“Everything seems to flourish in there”), the hole evoked a violent rending of the firmament, inviting deathly radiation. Americans felt that their lives were in danger. An abstract, atmospheric problem had been reduced to the size of the human imagination. It had been made just small enough, and just large enough, to break through.
  • Four years after “Changing Climate,” two years after a hole had torn open the firmament and a month after the United States and more than three dozen other nations signed a treaty to limit use of CFCs, the climate-change corps was ready to celebrate. It had become conventional wisdom that climate change would follow ozone’s trajectory. Reagan’s E.P.A. administrator, Lee M. Thomas, said as much the day he signed the Montreal Protocol on Substances That Deplete the Ozone Layer (the successor to the Vienna Convention), telling reporters that global warming was likely to be the subject of a future international agreement
  • Congress had already begun to consider policy — in 1987 alone, there were eight days of climate hearings, in three committees, across both chambers of Congress; Senator Joe Biden, a Delaware Democrat, had introduced legislation to establish a national climate-change strategy. And so it was that Jim Hansen found himself on Oct. 27 in the not especially distinguished ballroom of the Quality Inn on New Jersey Avenue, a block from the Capitol, at “Preparing for Climate Change,” which was technically a conference but felt more like a wedding.
  • John Topping was an old-line Rockefeller Republican, a Commerce Department lawyer under Nixon and an E.P.A. official under Reagan. He first heard about the climate problem in the halls of the E.P.A. in 1982 and sought out Hansen, who gave him a personal tutorial. Topping was amazed to discover that out of the E.P.A.’s 13,000-person staff, only seven people, by his count, were assigned to work on climate, though he figured it was more important to the long-term security of the nation than every other environmental issue combined.
  • Glancing around the room, Jim Hansen could chart, like an arborist counting rings on a stump, the growth of the climate issue over the decade. Veterans like Gordon MacDonald, George Woodwell and the environmental biologist Stephen Schneider stood at the center of things. Former and current staff members from the congressional science committees (Tom Grumbly, Curtis Moore, Anthony Scoville) made introductions to the congressmen they advised. Hansen’s owlish nemesis Fred Koomanoff was present, as were his counterparts from the Soviet Union and Western Europe. Rafe Pomerance’s cranium could be seen above the crowd, but unusually he was surrounded by colleagues from other environmental organizations that until now had shown little interest in a diffuse problem with no proven fund-raising record. The party’s most conspicuous newcomers, however, the outermost ring, were the oil-and-gas executives.
  • That evening, as a storm spat and coughed outside, Rafe Pomerance gave one of his exhortative speeches urging cooperation among the various factions, and John Chafee and Roger Revelle received awards; introductions were made and business cards earnestly exchanged. Not even a presentation by Hansen of his research could sour the mood. The next night, on Oct. 28, at a high-spirited dinner party in Topping’s townhouse on Capitol Hill, the oil-and-gas men joked with the environmentalists, the trade-group representatives chatted up the regulators and the academics got merrily drunk. Mikhail Budyko, the don of the Soviet climatologists, settled into an extended conversation about global warming with Topping’s 10-year-old son. It all seemed like the start of a grand bargain, a uniting of factions — a solution.
  • Hansen was accustomed to the bureaucratic nuisances that attended testifying before Congress; before a hearing, he had to send his formal statement to NASA headquarters, which forwarded it to the White House’s Office of Management and Budget for approval. “Major greenhouse climate changes are a certainty,” he had written. “By the 2010s [in every scenario], essentially the entire globe has very substantial warming.”
  • By all appearances, plans for major policy continued to advance rapidly. After the Johnston hearing, Timothy Wirth, a freshman Democratic senator from Colorado on the energy committee, began to plan a comprehensive package of climate-change legislation — a New Deal for global warming. Wirth asked a legislative assistant, David Harwood, to consult with experts on the issue, beginning with Rafe Pomerance, in the hope of converting the science of climate change into a new national energy policy.
  • In March 1988, Wirth joined 41 other senators, nearly half of them Republicans, to demand that Reagan call for an international treaty modeled after the ozone agreement. Because the United States and the Soviet Union were the world’s two largest contributors of carbon emissions, responsible for about one-third of the world total, they should lead the negotiations. Reagan agreed. In May, he signed a joint statement with Mikhail Gorbachev that included a pledge to cooperate on global warming.
  • Al Gore himself had, for the moment, withdrawn his political claim to the issue. In 1987, at the age of 39, Gore announced that he was running for president, in part to bring attention to global warming, but he stopped emphasizing it after the subject failed to captivate New Hampshire primary voters.
  • 5. ‘You Will See Things That You Shall Believe’ Summer 1988
  • It was the hottest and driest summer in history. Everywhere you looked, something was bursting into flames. Two million acres in Alaska incinerated, and dozens of major fires scored the West. Yellowstone National Park lost nearly one million acres. Smoke was visible from Chicago, 1,600 miles away.
  • In Nebraska, suffering its worst drought since the Dust Bowl, there were days when every weather station registered temperatures above 100 degrees. The director of the Kansas Department of Health and Environment warned that the drought might be the dawning of a climatic change that within a half century could turn the state into a desert.
  • On June 22 in Washington, where it hit 100 degrees, Rafe Pomerance received a call from Jim Hansen, who was scheduled to testify the following morning at a Senate hearing called by Timothy Wirth. “I hope we have good media coverage tomorrow,” Hansen said.
  • Hansen had just received the most recent global temperature data. Just over halfway into the year, 1988 was setting records. Already it had nearly clinched the hottest year in history. Ahead of schedule, the signal was emerging from the noise. “I’m going to make a pretty strong statement,” Hansen said.
  • Hansen returned to his testimony. He wrote: “The global warming is now large enough that we can ascribe with a high degree of confidence a cause-and-effect relationship to the greenhouse effect.” He wrote: “1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.” He wrote: “The greenhouse effect has been detected, and it is changing our climate now.”
  • “We have only one planet,” Senator Bennett Johnston intoned. “If we screw it up, we have no place to go.” Senator Max Baucus, a Democrat from Montana, called for the United Nations Environment Program to begin preparing a global remedy to the carbon-dioxide problem. Senator Dale Bumpers, a Democrat of Arkansas, previewed Hansen’s testimony, saying that it “ought to be cause for headlines in every newspaper in America tomorrow morning.” The coverage, Bumpers emphasized, was a necessary precursor to policy. “Nobody wants to take on any of the industries that produce the things that we throw up into the atmosphere,” he said. “But what you have are all these competing interests pitted against our very survival.”
  • Hansen, wiping his brow, spoke without affect, his eyes rarely rising from his notes. The warming trend could be detected “with 99 percent confidence,” he said. “It is changing our climate now.” But he saved his strongest comment for after the hearing, when he was encircled in the hallway by reporters. “It is time to stop waffling so much,” he said, “and say that the evidence is pretty strong that the greenhouse effect is here.”
  • The press followed Bumpers’s advice. Hansen’s testimony prompted headlines in dozens of newspapers across the country, including The New York Times, which announced, across the top of its front page: “Global Warming Has Begun, Expert Tells Senate.”
  • Rafe Pomerance called his allies on Capitol Hill, the young staff members who advised politicians, organized hearings, wrote legislation. We need to finalize a number, he told them, a specific target, in order to move the issue — to turn all this publicity into policy. The Montreal Protocol had called for a 50 percent reduction in CFC emissions by 1998. What was the right target for carbon emissions? It wasn’t enough to exhort nations to do better. That kind of talk might sound noble, but it didn’t change investments or laws. They needed a hard goal — something ambitious but reasonable. And they needed it soon: Just four days after Hansen’s star turn, politicians from 46 nations and more than 300 scientists would convene in Toronto at the World Conference on the Changing Atmosphere, an event described by Philip Shabecoff of The New York Times as “Woodstock for climate change.”
  • Pomerance had a proposal: a 20 percent reduction in carbon emissions by 2000. Ambitious, Harwood said. In all his work planning climate policy, he had seen no assurance that such a steep drop in emissions was possible. Then again, 2000 was more than a decade off, so it allowed for some flexibility.
  • Mintzer pointed out that a 20 percent reduction was consistent with the academic literature on energy efficiency. Various studies over the years had shown that you could improve efficiency in most energy systems by roughly 20 percent if you adopted best practices.
  • Of course, with any target, you had to take into account the fact that the developing world would inevitably consume much larger quantities of fossil fuels by 2000. But those gains could be offset by a wider propagation of the renewable technologies already at hand — solar, wind, geothermal. It was not a rigorous scientific analysis, Mintzer granted, but 20 percent sounded plausible. We wouldn’t need to solve cold fusion or ask Congress to repeal the law of gravity. We could manage it with the knowledge and technology we already had.
  • Besides, Pomerance said, 20 by 2000 sounds good.
  • The conference’s final statement, signed by all 400 scientists and politicians in attendance, repeated the demand with a slight variation: a 20 percent reduction in carbon emissions by 2005. Just like that, Pomerance’s best guess became global diplomatic policy.
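A fixed-percentage target like "20 by 2000" implies a constant annual reduction rate, which is easy to back out. A small sketch of that arithmetic (the helper function is hypothetical, not anything the negotiators used):

```python
def annual_rate_for_target(total_cut, years):
    """Constant annual fractional change needed to reach a total cut.

    total_cut: overall fractional reduction (0.20 for a 20 percent cut)
    years: number of years available to get there
    """
    return (1 - total_cut) ** (1 / years) - 1

# 20 percent below the base year by 2000, starting from 1988:
rate = annual_rate_for_target(0.20, 2000 - 1988)
print(f"{rate:.2%} per year")  # roughly -1.84% per year
```

The Toronto variant, 20 percent by 2005, relaxes that to roughly minus 1.3 percent a year, which is one reason the extra five years of "flexibility" mattered.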
  • Hansen, emerging from Anniek’s successful cancer surgery, took it upon himself to start a one-man public information campaign. He gave news conferences and was quoted in seemingly every article about the issue; he even appeared on television with homemade props. Like an entrant at an elementary-school science fair, he made “loaded dice” out of sections of cardboard and colored paper to illustrate the increased likelihood of hotter weather in a warmer climate. Public awareness of the greenhouse effect reached a new high of 68 percent.
  • global warming became a major subject of the presidential campaign. While Michael Dukakis proposed tax incentives to encourage domestic oil production and boasted that coal could satisfy the nation’s energy needs for the next three centuries, George Bush took advantage. “I am an environmentalist,” he declared on the shore of Lake Erie, the first stop on a five-state environmental tour that would take him to Boston Harbor, Dukakis’s home turf. “Those who think we are powerless to do anything about the greenhouse effect,” he said, “are forgetting about the White House effect.”
  • His running mate emphasized the ticket’s commitment to the issue at the vice-presidential debate. “The greenhouse effect is an important environmental issue,” Dan Quayle said. “We need to get on with it. And in a George Bush administration, you can bet that we will.”
  • This kind of talk roused the oil-and-gas men. “A lot of people on the Hill see the greenhouse effect as the issue of the 1990s,” a gas lobbyist told Oil & Gas Journal. Before a meeting of oil executives shortly after the “environmentalist” candidate won the election, Representative Dick Cheney, a Wyoming Republican, warned, “It’s going to be very difficult to fend off some kind of gasoline tax.” The coal industry, which had the most to lose from restrictions on carbon emissions, had moved beyond denial to resignation. A spokesman for the National Coal Association acknowledged that the greenhouse effect was no longer “an emerging issue. It is here already, and we’ll be hearing more and more about it.”
  • By the end of the year, 32 climate bills had been introduced in Congress, led by Wirth’s omnibus National Energy Policy Act of 1988. Co-sponsored by 13 Democrats and five Republicans, it established as a national goal an “International Global Agreement on the Atmosphere by 1992,” ordered the Energy Department to submit to Congress a plan to reduce energy use by at least 2 percent a year through 2005 and directed the Congressional Budget Office to calculate the feasibility of a carbon tax. A lawyer for the Senate energy committee told an industry journal that lawmakers were “frightened” by the issue and predicted that Congress would eventually pass significant legislation after Bush took office.
  • The other great powers refused to wait. The German Parliament created a special commission on climate change, which concluded that action had to be taken immediately, “irrespective of any need for further research,” and that the Toronto goal was inadequate; it recommended a 30 percent reduction of carbon emissions.
  • Margaret Thatcher, who had studied chemistry at Oxford, warned in a speech to the Royal Society that global warming could “greatly exceed the capacity of our natural habitat to cope” and that “the health of the economy and the health of our environment are totally dependent upon each other.”
  • The prime ministers of Canada and Norway called for a binding international treaty on the atmosphere; Sweden’s Parliament went further, announcing a national strategy to stabilize emissions at the 1988 level and eventually imposing a carbon tax.
  • The United Nations unanimously endorsed the establishment, by the World Meteorological Organization and the United Nations Environment Program, of an Intergovernmental Panel on Climate Change, composed of scientists and policymakers, to conduct scientific assessments and develop global climate policy.
  • One of the I.P.C.C.’s first sessions to plan an international treaty was hosted by the State Department, 10 days after Bush’s inauguration. James Baker chose the occasion to make his first speech as secretary of state. “We can probably not afford to wait until all of the uncertainties about global climate change have been resolved,” he said. “Time will not make the problem go away.”
  • On April 14, 1989, a bipartisan group of 24 senators, led by the majority leader, George Mitchell, requested that Bush cut emissions in the United States even before the I.P.C.C.’s working group made its recommendation. “We cannot afford the long lead times associated with a comprehensive global agreement,” the senators wrote. Bush had promised to combat the greenhouse effect with the White House effect. The self-proclaimed environmentalist was now seated in the Oval Office. It was time.
  • 8. ‘You Never Beat The White House’ April 1989
  • After Jim Baker gave his boisterous address to the I.P.C.C. working group at the State Department, he received a visit from John Sununu, Bush’s chief of staff. Leave the science to the scientists, Sununu told Baker. Stay clear of this greenhouse-effect nonsense. You don’t know what you’re talking about. Baker, who had served as Reagan’s chief of staff, didn’t speak about the subject again.
  • Despite his reputation as a political wolf, Sununu still thought of himself as a scientist — an “old engineer,” as he was fond of putting it, having earned a Ph.D. in mechanical engineering from M.I.T. decades earlier. He lacked the reflexive deference that so many of his political generation reserved for the class of elite government scientists.
  • Since World War II, he believed, conspiratorial forces had used the imprimatur of scientific knowledge to advance an “anti-growth” doctrine. He reserved particular disdain for Paul Ehrlich’s “The Population Bomb,” which prophesied that hundreds of millions of people would starve to death if the world took no step to curb population growth; the Club of Rome, an organization of European scientists, heads of state and economists, which similarly warned that the world would run out of natural resources; and as recently as the mid-’70s, the hypothesis advanced by some of the nation’s most celebrated scientists — including Carl Sagan, Stephen Schneider and Ichtiaque Rasool — that a new ice age was dawning, thanks to the proliferation of man-made aerosols. All were theories of questionable scientific merit, portending vast, authoritarian remedies to halt economic progress.
  • Sununu had suspected that the greenhouse effect belonged to this nefarious cabal since 1975, when the anthropologist Margaret Mead convened a symposium on the subject at the National Institute of Environmental Health Sciences.
  • When Mead talked about “far-reaching” decisions and “long-term consequences,” Sununu heard the marching of jackboots.
  • While Sununu and Darman reviewed Hansen’s statements, the E.P.A. administrator, William K. Reilly, took a new proposal to the White House. The next meeting of the I.P.C.C.’s working group was scheduled for Geneva the following month, in May; it was the perfect occasion, Reilly argued, to take a stronger stand on climate change. Bush should demand a global treaty to reduce carbon emissions.
  • Sununu wouldn’t budge. He ordered the American delegates not to make any commitment in Geneva. Very soon after that, someone leaked the exchange to the press.
  • A deputy of Jim Baker pulled Reilly aside. He said he had a message from Baker, who had observed Reilly’s infighting with Sununu. “In the long run,” the deputy warned Reilly, “you never beat the White House.”
  • 9. ‘A Form of Science Fraud’ May 1989
  • The cameras followed Hansen and Gore into the marbled hallway. Hansen insisted that he wanted to focus on the science. Gore focused on the politics. “I think they’re scared of the truth,” he said. “They’re scared that Hansen and the other scientists are right and that some dramatic policy changes are going to be needed, and they don’t want to face up to it.”
  • The censorship did more to publicize Hansen’s testimony and the dangers of global warming than anything he could have possibly said. At the White House briefing later that morning, Press Secretary Marlin Fitzwater admitted that Hansen’s statement had been changed. He blamed an official “five levels down from the top” and promised that there would be no retaliation. Hansen, he added, was “an outstanding and distinguished scientist” and was “doing a great job.”
  • 10. The White House Effect Fall 1989
  • The Los Angeles Times called the censorship “an outrageous assault.” The Chicago Tribune said it was the beginning of “a cold war on global warming,” and The New York Times warned that the White House’s “heavy-handed intervention sends the signal that Washington wants to go slow on addressing the greenhouse problem.”
  • Darman went to see Sununu. He didn’t like being accused of censoring scientists. They needed to issue some kind of response. Sununu called Reilly to ask if he had any ideas. We could start, Reilly said, by recommitting to a global climate treaty. The United States was the only Western nation on record as opposing negotiations.
  • Sununu sent a telegram to Geneva endorsing a plan “to develop full international consensus on necessary steps to prepare for a formal treaty-negotiating process. The scope and importance of this issue are so great that it is essential for the U.S. to exercise leadership.”
  • Sununu seethed at any mention of the subject. He had taken it upon himself to study the greenhouse effect more deeply; he had a rudimentary, one-dimensional general circulation model installed on his personal desktop computer. He decided that the models promoted by Jim Hansen were a lot of bunk. They were horribly imprecise in scale and underestimated the ocean’s ability to mitigate warming. Sununu complained about Hansen to D. Allan Bromley, a nuclear physicist from Yale who, at Sununu’s recommendation, was named Bush’s science adviser. Hansen’s findings were “technical poppycock” that didn’t begin to justify such wild-eyed pronouncements that “the greenhouse effect is here” or that the 1988 heat waves could be attributed to global warming, let alone serve as the basis for national economic policy.
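For context, the simplest climate models of the kind a desktop machine of the era could run reduce the planet to a single energy balance: absorbed sunlight in, thermal radiation out. A zero-dimensional sketch of that idea follows; all parameter values are textbook-style illustrations, not Sununu's actual model, which was one-dimensional:

```python
# Zero-dimensional energy-balance sketch: the crudest class of climate
# model. Parameter values are illustrative textbook numbers.

SOLAR_CONSTANT = 1361.0   # incoming solar irradiance, W/m^2
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temperature(emissivity=0.612):
    """Surface temperature (K) at which emitted infrared balances absorbed sunlight.

    The effective emissivity crudely stands in for the greenhouse effect:
    lowering it (more trapping) raises the equilibrium temperature.
    """
    absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4   # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

t0 = equilibrium_temperature()       # about 288 K, near the observed mean
t1 = equilibrium_temperature(0.60)   # slightly stronger greenhouse trapping
print(round(t0, 1), round(t1 - t0, 2))
```

Even this toy balance reproduces the direction of the effect; the dispute described in the passage was over magnitude and feedbacks such as clouds and ocean uptake, which zero- and one-dimensional models handle poorly.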
  • When a junior staff member in the Energy Department, in a meeting at the White House with Sununu and Reilly, mentioned an initiative to reduce fossil-fuel use, Sununu interrupted her. “Why in the world would you need to reduce fossil-fuel use?” he asked. “Because of climate change,” the young woman replied. “I don’t want anyone in this administration without a scientific background using ‘climate change’ or ‘global warming’ ever again,” he said. “If you don’t have a technical basis for policy, don’t run around making decisions on the basis of newspaper headlines.” After the meeting, Reilly caught up to the staff member in the hallway. She was shaken. Don’t take it personally, Reilly told her. Sununu might have been looking at you, but that was directed at me.
  • Reilly, for his part, didn’t entirely blame Sununu for Bush’s indecision on the prospect of a climate treaty. The president had never taken a vigorous interest in global warming and was mainly briefed about it by nonscientists. Bush had brought up the subject on the campaign trail, in his speech about the White House effect, after leafing through a briefing booklet for a new issue that might generate some positive press. When Reilly tried in person to persuade him to take action, Bush deferred to Sununu and Baker. Why don’t the three of you work it out, he said. Let me know when you decide.
  • Relations between Sununu and Reilly became openly adversarial. Reilly, Sununu thought, was a creature of the environmental lobby. He was trying to impress his friends at the E.P.A. without having a basic grasp of the science himself.
  • Pomerance had the sinking feeling that the momentum of the previous year was beginning to flag. The censoring of Hansen’s testimony and the inexplicably strident opposition from John Sununu were ominous signs. So were the findings of a report Pomerance had commissioned, published in September by the World Resources Institute, tracking global greenhouse-gas emissions. The United States was the largest contributor by far, producing nearly a quarter of the world’s carbon emissions, and its contribution was growing faster than that of every other country. Bush’s indecision, or perhaps inattention, had already managed to delay the negotiation of a global climate treaty until 1990 at the earliest, perhaps even 1991. By then, Pomerance worried, it would be too late.
  • Pomerance tried to be more diplomatic. “The president made a commitment to the American people to deal with global warming,” he told The Washington Post, “and he hasn’t followed it up.” He didn’t want to sound defeated. “There are some good building blocks here,” Pomerance said, and he meant it. The Montreal Protocol on CFCs wasn’t perfect at first, either — it had huge loopholes and weak restrictions. Once in place, however, the restrictions could be tightened. Perhaps the same could happen with climate change. Perhaps. Pomerance was not one for pessimism. As William Reilly told reporters, dutifully defending the official position forced upon him, it was the first time that the United States had formally endorsed the concept of an emissions limit. Pomerance wanted to believe that this was progress.
  • All week in Noordwijk, Becker couldn’t stop talking about what he had seen in Zeeland. After a flood in 1953, when the sea swallowed much of the region, killing more than 2,000 people, the Dutch began to build the Delta Works, a vast concrete-and-steel fortress of movable barriers, dams and sluice gates — a masterpiece of human engineering. The whole system could be locked into place within 90 minutes, defending the land against storm surge. It reduced the country’s exposure to the sea by 700 kilometers, Becker explained. The United States coastline was about 153,000 kilometers long. How long, he asked, was the entire terrestrial coastline? Because the whole world was going to need this. In Zeeland, he said, he had seen the future.
  • Ken Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, Calif., has a habit of asking new graduate students to name the largest fundamental breakthrough in climate physics since 1979. It’s a trick question. There has been no breakthrough. As with any mature scientific discipline, there is only refinement. The computer models grow more precise; the regional analyses sharpen; estimates solidify into observational data. Where there have been inaccuracies, they have tended to be in the direction of understatement.
  • More carbon has been released into the atmosphere since the final day of the Noordwijk conference, Nov. 7, 1989, than in the entire history of civilization preceding it.
  • Despite every action taken since the Charney report — the billions of dollars invested in research, the nonbinding treaties, the investments in renewable energy — the only number that counts, the total quantity of global greenhouse gas emitted per year, has continued its inexorable rise.
  • When it comes to our own nation, which has failed to make any binding commitments whatsoever, the dominant narrative for the last quarter century has concerned the efforts of the fossil-fuel industries to suppress science, confuse public knowledge and bribe politicians.
  • The mustache-twirling depravity of these campaigns has left the impression that the oil-and-gas industry always operated thus; while the Exxon scientists and American Petroleum Institute clerics of the ’70s and ’80s were hardly good Samaritans, they did not start multimillion-dollar disinformation campaigns, pay scientists to distort the truth or try to brainwash children in elementary schools, as their successors would.
  • It was James Hansen’s testimony before Congress in 1988 that, for the first time since the “Changing Climate” report, made oil-and-gas executives begin to consider the issue’s potential to hurt their profits. Exxon, as ever, led the field. Six weeks after Hansen’s testimony, Exxon’s manager of science and strategy development, Duane LeVine, prepared an internal strategy paper urging the company to “emphasize the uncertainty in scientific conclusions.” This shortly became the default position of the entire sector. LeVine, it so happened, served as chairman of the global petroleum industry’s Working Group on Global Climate Change, created the same year, which adopted Exxon’s position as its own.
  • The American Petroleum Institute, after holding a series of internal briefings on the subject in the fall and winter of 1988, including one for the chief executives of the dozen or so largest oil companies, took a similar, if slightly more diplomatic, line. It set aside money for carbon-dioxide policy — about $100,000, a fraction of the millions it was spending on the health effects of benzene, but enough to establish a lobbying organization called, in an admirable flourish of newspeak, the Global Climate Coalition.
  • The G.C.C. was conceived as a reactive body, to share news of any proposed regulations, but on a whim, it added a press campaign, to be coordinated mainly by the A.P.I. It gave briefings to politicians known to be friendly to the industry and approached scientists who professed skepticism about global warming. The A.P.I.’s payment for an original op-ed was $2,000.
  • It was joined by the U.S. Chamber of Commerce and 14 other trade associations, including those representing the coal, electric-grid and automobile industries.
  • In October 1989, scientists allied with the G.C.C. began to be quoted in national publications, giving an issue that lacked controversy a convenient fulcrum. “Many respected scientists say the available evidence doesn’t warrant the doomsday warnings,” was the caveat that began to appear in articles on climate change.
  • The following year, when President Bill Clinton proposed an energy tax in the hope of meeting the goals of the Rio treaty, the A.P.I. invested $1.8 million in a G.C.C. disinformation campaign. Senate Democrats from oil-and-coal states joined Republicans to defeat the tax proposal, which later contributed to the Republicans’ rout of Democrats in the midterm congressional elections in 1994 — the first time the Republican Party had won control of both houses in 40 years.
  • The G.C.C. spent $13 million on a single ad campaign intended to weaken support for the 1997 Kyoto Protocol, which committed its parties to reducing greenhouse-gas emissions by 5 percent relative to 1990 levels. The Senate, which would have had to ratify the agreement, took a pre-emptive vote declaring its opposition; the resolution passed 95-0. There has never been another serious effort to negotiate a binding global climate treaty.
  • This has made the corporation an especially vulnerable target for the wave of compensatory litigation that began in earnest in the last three years and may last a generation. Tort lawsuits have become possible only in recent years, as scientists have begun more precisely to attribute regional effects to global emission levels. This is one subfield of climate science that has advanced significantly.
  • Pomerance had not been among the 400 delegates invited to Noordwijk. But together with three young activists — Daniel Becker of the Sierra Club, Alden Meyer of the Union of Concerned Scientists and Stewart Boyle from Friends of the Earth — he had formed his own impromptu delegation. Their constituency, they liked to say, was the climate itself. Their mission was to pressure the delegates to include in the final conference statement, which would be used as the basis for a global treaty, the target proposed in Toronto: a 20 percent reduction of greenhouse-gas combustion by 2005. It was the only measure that mattered, the amount of emissions reductions, and the Toronto number was the strongest global target yet proposed.
  • The delegations would review the progress made by the I.P.C.C. and decide whether to endorse a framework for a global treaty. There was a general sense among the delegates that they would, at minimum, agree to the target proposed by the host, the Dutch environmental minister, more modest than the Toronto number: a freezing of greenhouse-gas emissions at 1990 levels by 2000. Some believed that if the meeting was a success, it would encourage the I.P.C.C. to accelerate its negotiations and reach a decision about a treaty sooner. But at the very least, the world’s environmental ministers should sign a statement endorsing a hard, binding target of emissions reductions. The mood among the delegates was electric, nearly giddy — after more than a decade of fruitless international meetings, they could finally sign an agreement that meant something.
  • 11. ‘The Skunks at The Garden Party’ November 1989
  • It was nearly freezing — Nov. 6, 1989, on the coast of the North Sea in the Dutch resort town of Noordwijk.
  • Losing Earth: The Decade We Almost Stopped Climate Change. We knew everything we needed to know, and nothing stood in our way. Nothing, that is, except ourselves. A tragedy in two acts. By Nathaniel Rich. Photographs and Videos by George Steinmetz. AUG. 1, 2018
cjlee29

Paul Ryan endorses Donald Trump - The Washington Post - 0 views

  • House Speaker Paul D. Ryan (R-Wis.) ended a month-long holdout by formally backing his party’s presumptive presidential nominee: Donald Trump.
  • On Thursday, the speaker penned a guest column for his hometown newspaper in which he trumpeted the controversial real-estate mogul as someone who could support the speaker’s conservative agenda.
  • Like those of many senior Republicans, Ryan’s endorsement came with its share of caveats about the speaker’s and the presumptive nominee’s remaining policy differences.
  • ...18 more annotations...
  • It did not signal any level of comfort with Trump’s sometimes bombastic style.
  • The move marks a big about-face for Ryan, who four weeks ago declared he was “not there yet” in terms of endorsing Trump and questioned whether the controversial businessman was even a conservative.
  • Ryan and Trump met once in person in mid-May when the billionaire crisscrossed Capitol Hill for meetings with House and Senate leaders.
  • Throughout the talks, neither side agreed to switch any of their policy positions,
  • endorsement should not be construed as the sort of “real unification” of Republicans
  • Ryan’s move may also signal that the speaker and other top Republicans are worried about keeping the House and Senate in Republican hands come November, and believe the best way to do that is to unite the party.
  • Senate Majority Leader Mitch McConnell (R-Ky.) was among the first top leaders to say he would back Trump.
  • Ryan became the last senior Republican congressional leader to throw his weight behind Trump’s candidacy.
  • Ryan’s chief communications adviser, Brendan Buck, said reporters need not mince words to figure out what it all meant.
  • “Paul Ryan in many ways is the antithesis of Donald Trump; he’s everything that Donald Trump is not. He’s a decent human being. He is a conservative. He is steeped in public policy. He cares about ideas. He’s a person who conducts himself with civility and grace in public life. He doesn’t put down his opponents,”
  • Ryan’s column in his local newspaper left no doubt where he stood after speaking “at great length” with Trump since he initially declared hesitation about his candidacy.
  • House Republicans will be inseparably tied to their toxic front-runner in November, case closed,
  • Ryan’s dragged out decision underscores how truly vulnerable Donald Trump makes House Republicans in swing districts
  • at odds with Trump’s positions on key policy planks dear to mainstream Republicans of the past 40 years
  • free trade agenda and the effort to rein in federal spending on entitlements.
  • Those issues were the hallmark of Ryan’s early congressional career and Trump stands squarely against them.
  • Trump’s proposals to ban all Muslim travel into the United States and the candidate’s brusque comments regarding minorities, women and the disabled gave Ryan pause.
  • Those concerns appear to remain, and Ryan vowed to speak out against Trump if he crosses lines again.
Javier E

Why Matthew Yglesias Left Vox - The Atlantic - 0 views

  • Yglesias explained why pushing back against the “dominant sensibility” in digital journalism is important to him. He said he believes that certain voguish positions are substantively wrong—for instance, abolishing or defunding police—and that such arguments, as well as rhetorical fights over terms like Latinx, alienate many people from progressive politics and the Democratic Party.
  • there’s a dynamic where there’s media people who really elevated the profile of [Alexandria Ocasio-Cortez] and a couple of other members way above their actual numerical standing.”
  • “The people making the media are young college graduates in big cities, and that kind of politics makes a lot of sense to them,” he said. “And we keep seeing that older people, and working-class people of all races and ethnicities, just don’t share that entire worldview. It’s important to me to be in a position to step outside that dynamic …
  • ...6 more annotations...
  • One trend that exacerbated that challenge: colleagues in media treating the expression of allegedly problematic ideas as if they were a human-resources issue. Earlier this year, for instance, after Yglesias signed a group letter published in Harper’s magazine objecting to cancel culture, one of his colleagues, Emily VanDerWerff, told Vox editors that his signature made her feel “less safe at Vox.”
  • I asked Yglesias if that matter in any way motivated his departure. “Something we’ve seen in a lot of organizations is increasing sensitivity about language and what people say,” he told me. “It’s a damaging trend in the media in particular because it is an industry that’s about ideas, and if you treat disagreement as a source of harm or personal safety, then it’s very challenging to do good work.”
  • an experiment that the Harvard social scientist Cass Sunstein conducted in two different communities in Colorado: left-leaning Boulder and right-leaning Colorado Springs. Residents in each community were gathered into small groups to discuss their views on three controversial topics: climate change, same-sex marriage, and affirmative action. Afterward, participants were asked to report on the opinions of their discussion group as well as their own views on the subjects. In both communities, gathering into groups composed of mostly like-minded people to discuss controversial subjects made individuals more settled and extreme in their views.
  • “Liberals, in Boulder, became distinctly more liberal on all three issues. Conservatives, in Colorado Springs, become distinctly more conservative on all three issues,” Sunstein wrote of his experiment. “Deliberation much decreased diversity among liberals; it also much decreased diversity among conservatives. After deliberation, members of nearly all groups showed, in their post-deliberation statements, far more uniformity than they did before deliberation.”
  • Compelling evidence points to a big cost associated with ideological bubbles, I argued: They make us more confident that we know everything, more set and extreme in our views, more prone to groupthink, more vulnerable to fallacies, and less circumspect.
  • The New York Times, New York, The Intercept, Vox, Slate, The New Republic, and other outlets are today less ideologically diverse in their staff and less tolerant of contentious challenges to the dominant viewpoint of college-educated progressives than they have been in the recent past. I fear that in the short term, Americans will encounter less rigorous and more polarizing journalism. In the long term, a dearth of ideological diversity risks consequences we cannot fully anticipate.
Javier E

Barry Latzer on Why Crime Rises and Falls - The Atlantic - 0 views

  • Barry Latzer: The optimistic view is that the late ‘60s crime tsunami, which ended in the mid-1990s, was sui generis, and we are now in a period of "permanent peace," with low crime for the foreseeable future.
  • Pessimists rely on the late Eric Monkkonen's cyclical theory of crime, which suggests that the successive weakening and strengthening of social controls on violence lead to a crime roller coaster. The current zeitgeist favors a weakening of social controls, including reductions in incarcerative sentences and restrictions on police, on the grounds that the criminal-justice system is too racist, unfair, and expensive. If Monkkonen were correct, we will get a crime rise before long.
  • the most provocative feature of your book: your belief that different cultural groups show different propensities for crime, enduring over time, and that these groups carry these propensities with them when they migrate from place to place.
  • ...21 more annotations...
  • this idea and its implications stir more controversy among criminologists than any other. Would you state your position as precisely as possible in this brief space?
  • Latzer: First of all, culture and race, in the biological or genetic sense, are very different. Were it not for the racism of the 18th and 19th centuries, we might not have had a marked cultural difference between blacks and whites in the U.S. But history cannot be altered, only studied and sometimes deplored.
  • Different groups of people, insofar as they consider themselves separate from others, share various cultural characteristics: dietary, religious, linguistic, artistic, etc. They also share common beliefs and values. There is nothing terribly controversial about this. If it is mistaken then the entire fields of sociology and anthropology are built on mistaken premises.
  • With respect to violent crime, scholars are most interested in a group's preference for violence as a way of resolving interpersonal conflict. Some groups, traditionally rural, developed cultures of “honor”—strong sensitivities to personal insult. We see this among white and black southerners in the 19th century, and among southern Italian and Mexican immigrants to the U.S. in the early 20th century. These groups engaged in high levels of assaultive crimes in response to perceived slights, mainly victimizing their own kind.
  • This honor culture explains the high rates of violent crime among African Americans who, living amidst southern whites for over a century, incorporated those values. When blacks migrated north in the 20th century, they transported these rates of violence. Elijah Anderson's book, The Code of the Streets, describes the phenomenon, and Thomas Sowell, in Black Rednecks and White Liberals, helps explain it.
  • Theories of crime that point to poverty and racism have the advantage of explaining why low-income groups predominate when it comes to violent crime. What they really explain, though, is why more affluent groups refrain from such crime. And the answer is that middle-class people (regardless of race) stand to lose a great deal from such behavior.
  • The cultural explanation for violence is superior to explanations that rest of poverty or racism, however, because it can account for the differentials in the violent-crime rates of groups with comparable adversities
  • Frum: Let’s flash forward to the present day. You make short work of most of the theories explaining the crime drop-off since the mid-1990s: the Freakonomics theory that attributes the crime decline to easier access to abortion after 1970; the theory that credits reductions in lead poisoning; and the theory that credits the mid-1990s economic spurt. Why are these ideas wrong? And what would you put in their place?
  • both the abortion and leaded-gasoline theories are mistaken because of a failure to explain the crime spike that immediately preceded the great downturn. Abortions became freely available starting in the 1970s, which is also when lead was removed from gasoline. Fast-forward 15 to 20 years to the period in which unwanted babies had been removed from the population and were not part of the late adolescent, early adult, cohort. This cohort was responsible for the huge spike in crime in the late 1980s, early 1990s, the crack cocaine crime rise. Why didn't the winnowing through abortion of this population reduce crime?
  • Likewise, the lead removal theory. The same "lead-free" generation that engaged in less crime from 1993 on committed high rates of violent crime between 1987 and 1992.
  • As for economic booms, it is tempting to argue that they reduce crime on the theory that people who have jobs and higher incomes have less incentive to rob and steal. This is true. But violent crimes, such as murder and manslaughter, assault, and rape, are not motivated by pecuniary interests. They are motivated by arguments, often of a seemingly petty nature, desires for sexual conquest by violence in the case of rape, or domestic conflicts, none of which are related to general economic conditions
  • Rises in violent crime have much more to do with migrations of high-crime cultures, especially to locations in which governments, particularly crime-control agents, are weak.
  • Declines are more likely when crime controls are strong, and there are no migrations or demographic changes associated with crime rises
  • In short, the aging of the violent boomer generation followed by the sudden rise and demise of the crack epidemic best explains the crime trough that began in the mid-1990s and seems to be continuing even today.
  • Contrary to leftist claims, strengthened law enforcement played a major role in the crime decline. The strengthening was the result of criminal-justice policy changes demanded by the public, black and white, and was necessitated by the weakness of the criminal justice system in the late ‘60s
  • On the other hand, conservatives tend to rely too much on the strength of the criminal-justice system in explaining crime oscillations, which, as I said, have a great to do with migrations and demographics
  • The contemporary challenge is to keep law enforcement strong without alienating African Americans, an especially difficult proposition given the outsized violent-crime rates in low-income black communities.
  • Frum: The sad exception to the downward trend in crime since 1990 is the apparent increase in mass shootings
  • Should such attacks be included in our thinking about crime? If so, how should we think about them?
  • If we separate out the ideologically motivated mass killings, such as Orlando (apparently) and San Bernardino, then we have a different problem. Surveilling potential killers who share a violent ideology will be extremely difficult but worthwhile. Limiting the availability of rapid-fire weapons with high-capacity ammunition clips is also worth doing, but politically divisive.
  • of course, developments abroad will affect the number of incidents, as will the copycat effect in the immediate aftermath of an incident. This is a complex problem, different from ordinary killings, which, by the way, take many more lives.
Alex Trudel

Europe Is Spying on You - The New York Times - 0 views

  • STRASBOURG, France — When Edward Snowden disclosed details of America’s huge surveillance program two years ago, many in Europe thought that the response would be increased transparency and stronger oversight of security services. European countries, however, are moving in the opposite direction. Instead of more public scrutiny, we are getting more snooping.
  • France recently adopted a controversial law on surveillance that permits major intrusions, without prior judicial authorization, into the private lives of suspects and those who communicate with them, live or work in the same place or even just happen to be near them.
  • Meanwhile, Austria is set to discuss a draft law that would allow a new security agency to operate with reduced external control and to collect and store communication data for up to six years. The Netherlands is considering legislation allowing dragnet surveillance of all telecommunications, indiscriminate gathering of metadata, decryption and intrusion into the computers of non-suspects. And in Finland, the government is even considering changing the Constitution to weaken privacy protections in order to ease the adoption of a bill granting the military and intelligence services the power to conduct electronic mass surveillance with little oversight.
  • ...5 more annotations...
  • More recently, as new technologies have offered more avenues to increase surveillance and data collection, the court has reiterated its position in a number of leading cases against several countries, including France, Romania, Russia and Britain, condemned for having infringed the right to private and family life in the interpretation of the court.
  • unnecessary “wide-ranging and particularly serious interference with the fundamental right to respect for private life” and personal data, this court reaffirmed the outstanding place privacy holds in Europe
  • If European governments and parliaments do not respect fundamental principles and judicial obligations, our lives will become much less private. Our ability to participate effectively in public life is threatened, too, because these measures curtail our freedom of speech and our right to receive information — including that of public interest. Not all whistleblowers have the technical knowledge Mr. Snowden possessed. Many would fear discovery if they communicated with journalists, who in turn would lose valuable sources, jeopardizing their ability to reveal unlawful conduct in both the public and private spheres. Watergates can only happen when whistleblowers feel protected.
  • First, legislation should limit surveillance and the use of data in a way that strictly respects the right to privacy as spelled out in the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights, European data protection standards, the case law of the European Court of Human Rights and that of the European Court of Justice. These norms oblige states to respect human rights when they gather and store information relating to our private lives and to protect individuals from unlawful surveillance, including when carried out by foreign agencies.
  • Third, security agencies must operate under independent scrutiny and judicial review. This will require intrusive oversight powers for parliaments and a judiciary that is involved in the decision-making process to ensure accountability. Countries that have adopted controversial surveillance laws should reconsider or amend them. And those considering new surveillance legislation should do so with great caution.
Javier E

Ex-KGB Agent Says Trump Was a Russian Asset. Does it Matter? - 0 views

  • If something like the most sinister plausible story turned out to be true, how much would it matter? Probably not that much.
  • I have merely come to think that even if we could have confirmed the worst, to the point that even Trump’s supporters could no longer deny it, it wouldn’t have changed very much. Trump wouldn’t have been forced to resign, and his Republican supporters would not have had to repudiate him. The controversy would have simply receded into the vast landscape of partisan talking points — one more thing liberals mock Trump over, and conservatives complain about the media for covering instead of Nancy Pelosi’s freezer or antifa or the latest campus outrage.
  • One reason I think that is because a great deal of incriminating information was confirmed and very little in fact changed as a result. In 2018, Buzzfeed reported, and the next year Robert Mueller confirmed, explosive details of a Russian kompromat operation. During the campaign, Russia had been dangling a Moscow building deal that stood to give hundreds of millions of dollars in profit to Trump, at no risk. Not only did he stand to gain this windfall, but he was lying in public at the time about his dealings with Russia, which gave Vladimir Putin additional leverage over him. (Russia could expose Trump’s lies at any time if he did something to displease Moscow.)
  • ...14 more annotations...
  • The truth, I suspect, was simultaneously about as bad as I suspected, and paradoxically anticlimactic. Trump was surrounded by all sorts of odious characters who manipulated him into saying and doing things that ran against the national interest. One of those characters was Putin. In the end, their influence ran up against the limits that the character over whom they had gained influence was a weak, failed president.
  • Mueller even testified that this arrangement gave Russia blackmail leverage over Trump. But by the time these facts had passed from the realm of the mysterious to the confirmed, they had become uninteresting.
  • Ultimately, whatever value Trump offered to Russia was compromised by his incompetence and limited ability to grasp firm control even of his own government’s foreign policy. It was not just the fabled “deep state” that undermined Trump. Even his own handpicked appointees constantly undermined him, especially on Russia. Whatever leverage Putin had was limited to a single individual, which meant there was nobody Trump could find to run the State Department, National Security Agency, and so on who shared his idiosyncratic Russophilia.
  • Shvets told Unger that the KGB cultivated Trump as an American leader, and persuaded him to run his ad attacking American alliances. “The ad was assessed by the active measures directorate as one of the most successful KGB operations at that time,” he said, “It was a big thing — to have three major American newspapers publish KGB soundbites.”
  • To be clear, while Shvets is a credible source, his testimony isn’t dispositive. There are any number of possible motives for a former Soviet spy turned critic of Russia’s regime to manufacture an indictment of Trump
  • This is what intelligence experts mean when they describe Trump as a Russian “asset.” It’s not the same as being an agent. An asset is somebody who can be manipulated, as opposed to somebody who is consciously and secretly working on your behalf.
  • A second reason is that reporter Craig Unger got a former KGB spy to confirm on the record that Russian intelligence had been working Trump for decades. In his new book, “American Kompromat,” Unger interviewed Yuri Shvets, who told him that the KGB manipulated Trump with simple flattery. “In terms of his personality, the guy is not a complicated cookie,” he said, “his most important characteristics being low intellect coupled with hyperinflated vanity. This makes him a dream for an experienced recruiter.”
  • If I had to guess today, I’d put the odds higher, perhaps over 50 percent. One reason for my higher confidence is that Trump has continued to fuel suspicion by taking anomalously pro-Russian positions. He met with Putin in Helsinki, appearing strangely submissive, and spouted Putin’s propaganda on a number of topics including the ridiculous possibility of a joint Russian-American cybersecurity unit. (Russia, of course, committed the gravest cyber-hack in American history not long ago, making Trump’s idea even more self-defeating in retrospect than it was at the time.) He seemed to go out of his way to alienate American allies and blow up cooperation every time they met during his tenure.
  • He would either refuse to admit Russian wrongdoing — Trump refused even to concede that the regime poisoned Alexei Navalny — or repeat bizarre snippets of Russian propaganda: NATO was a bad deal for America because Montenegro might launch an attack on Russia; the Soviets had to invade Afghanistan in the 1970s to defend against terrorism. These weren’t talking points he would pick up in his normal routine of watching Fox News and calling Republican sycophants.
  • there was a reasonable chance — I loosely pegged it at 10 or 20 percent — that the Soviets had planted some of these thoughts, which he had never expressed before the trip, in his head.
  • Trump returned from Moscow fired up with political ambition. He began the first of a long series of presidential flirtations, which included a flashy trip to New Hampshire. Two months after his Moscow visit, Trump spent almost $100,000 on a series of full-page newspaper ads that published a political manifesto. “An open letter from Donald J. Trump on why America should stop paying to defend countries that can afford to defend themselves,” as Trump labeled it, launched angry populist charges against the allies that benefited from the umbrella of American military protection. “Why are these nations not paying the United States for the human lives and billions of dollars we are losing to protect their interests?”
  • During the Soviet era, Russian intelligence cast a wide net to gain leverage over influential figures abroad. (The practice continues to this day.) The Russians would lure or entrap not only prominent politicians and cultural leaders, but also people whom they saw as having the potential for gaining prominence in the future. In 1986, Soviet ambassador Yuri Dubinin met Trump in New York, flattered him with praise for his building exploits, and invited him to discuss a building in Moscow. Trump visited Moscow in July 1987. He stayed at the National Hotel, in the Lenin Suite, which certainly would have been bugged. There is not much else in the public record to describe his visit, except Trump’s own recollection in The Art of the Deal that Soviet officials were eager for him to build a hotel there. (It never happened.)
  • In 2018, I became either famous or notorious — depending on your point of view — for writing a story speculating that Russia had secret leverage over Trump
  • Here is what I wrote in that controversial section:
katherineharron

Kim Reynolds, Iowa governor, signs controversial law shortening early and Election Day ... - 0 views

  • Republican Iowa Gov. Kim Reynolds on Monday signed into law a controversial bill aimed at limiting voting and making it harder for voters to return absentee ballots, her office announced Monday.
  • The legislation, which passed both Republican-controlled chambers of the state legislature last month, will reduce the number of early voting days from 29 days to 20 days.
  • It will also close polling places an hour earlier on Election Day (at 8 p.m. instead of 9 p.m.).
  • The bill additionally places new restrictions on absentee voting including banning officials from sending applications without a voter first requesting one and requiring ballots be received by the county before polls close on Election Day.
  • "It's our duty and responsibility to protect the integrity of every election. This legislation strengthens uniformity by providing Iowa's election officials with consistent parameters for Election Day, absentee voting, database maintenance, as well as a clear appeals process for local county auditors," Reynolds said in a statement Monday.
  • The new law drew immediate backlash from Democrats in the state, including a tweet from the Iowa Democratic party stating, "We deserve better."
  • Democratic election attorney Marc Elias similarly called the law "the first major suppression law since the 2020 election" in a tweet and noted that litigation could be forthcoming.
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
katherineharron

Brexit: What is the Withdrawal Agreement Bill and why is it so controversial? - CNN - 0 views

  • We're in the Brexit endgame -- or so Boris Johnson hopes.
  • By Thursday evening, the British Prime Minister intends to have done the seemingly impossible and passed a Brexit deal. But whether he is able to do that depends on a series of crucial votes by lawmakers on his Withdrawal Agreement Bill (WAB).
  • But the government's efforts to force it through in three days are proving controversial, and the bill could be picked apart and reshaped by lawmakers even if it succeeds in its initial vote on Tuesday.
  • The deal looked similar to the one previously negotiated by his predecessor Theresa May, with one big difference: Johnson's pact strips out May's hated Northern Irish backstop mechanism for a customs border in the Irish sea -- something May said she would never agree to.
  • Johnson is desperate to stick to his promise that the UK will leave the EU on October 31, but he can only achieve that with a rapidly accelerated timetable.
  • It's looking like the Prime Minister could squeak the WAB through Parliament on Tuesday evening. A CNN analysis showed that he could win by around three to five votes, with just enough Labour rebels and independents joining his side.
  • Ken Clarke, a former Conservative minister turned independent whose vote could be crucial, added: "If the Government is for some reason insistent on dashing for this completely silly and irrelevant date which it keeps staking its fate on then give some proper time for debate. Two-and-a-bit days of ordinary parliamentary hours is plainly quite insufficient."
  • The Labour Party is arguing that Johnson is running from proper parliamentary scrutiny. Its official position is to vote down both the bill and his timetable, but rebel MPs within the party could swing the votes towards Johnson.
  • But that timeframe for such a lengthy and significant bill is causing anger on the opposition backbenches. "Issues like this need to be properly debated not rushed through. Government is storing up very serious future problems by the way it is trying to implement this," Labour MP Yvette Cooper tweeted.
  • Alternatively, the Prime Minister could abandon the legislation altogether and seek a general election in an effort to resolve the mess.
Javier E

Elon Musk's 'anti-woke' Grok AI is disappointing his right-wing fans - The Washington Post - 0 views

  • Decrying what he saw as the liberal bias of ChatGPT, Elon Musk earlier this year announced plans to create an artificial intelligence chatbot of his own. In contrast to AI tools built by OpenAI, Microsoft and Google, which are trained to tread lightly around controversial topics, Musk’s would be edgy, unfiltered and anti-“woke,” meaning it wouldn’t hesitate to give politically incorrect responses.
  • Musk is fielding complaints from the political right that the chatbot gives liberal responses to questions about diversity programs, transgender rights and inequality.
  • “I’ve been using Grok as well as ChatGPT a lot as research assistants,” posted Jordan Peterson, the socially conservative psychologist and YouTube personality, Wednesday. The former is “near as woke as the latter,” he said.
  • The gripe drew a chagrined reply from Musk. “Unfortunately, the Internet (on which it is trained), is overrun with woke nonsense,” he responded. “Grok will get better. This is just the beta.”
  • While many tech ethicists and AI experts warn that these systems can absorb and reinforce harmful stereotypes, efforts by tech firms to counter those tendencies have provoked a backlash from some on the right who see them as overly censorial.
  • “I think both ChatGPT and Grok have probably been trained on similar Internet-derived corpora, so the similarity of responses should perhaps not be too surprising,”
  • So far, however, the people most offended by Grok’s answers seem to be the people who were counting on it to readily disparage minorities, vaccines and President Biden.
  • an academic researcher from New Zealand who examines AI bias, gained attention for a paper published in March that found ChatGPT’s responses to political questions tended to lean moderately left and socially libertarian. Recently, he subjected Grok to some of the same tests and found that its answers to political orientation tests were broadly similar to those of ChatGPT.
  • Touting xAI to former Fox News host Tucker Carlson in April, Musk accused OpenAI’s programmers of “training the AI to lie” or to refrain from commenting when asked about sensitive issues. (OpenAI wrote in a February blog post that its goal is not for the AI to lie, but for it to avoid favoring any one political group or taking positions on controversial topics.) Musk said his AI, in contrast, would be “a maximum truth-seeking AI,” even if that meant offending people.
  • Other AI researchers argue that the sort of political orientation tests used by Rozado overlook ways in which chatbots, including ChatGPT, often exhibit negative stereotypes about marginalized groups.
  • Musk and X did not respond to requests for comment as to what actions they’re taking to alter Grok’s politics, or whether that amounts to putting a thumb on the scale in much the same way Musk has accused OpenAI of doing with ChatGPT.