These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since.
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • The American experiment rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • Understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept.
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • In 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, Bancroft hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • In 1751, Franklin wrote an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • He wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies.
  • It closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • As early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that religion can only thrive if it is no part of government.
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change.
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country.
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • The geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A nation has borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of Chief Justice Taney’s ruling stunned readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (“Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____”) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
  [Photograph: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and, as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth the family farm, worked for subsistence and free of government interference, was far less common than the federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • Legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where it appealed to lawmakers eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson.
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, which could be bent into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • Its leaders argued that fighting the inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America's thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • In 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. "Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive," he'd once observed.108 Mass democracy can't work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

How Public Health Took Part in Its Own Downfall - The Atlantic - 0 views

  • when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.
  • By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health
  • By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals
  • In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice. By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.
  • Germ theory offered a seductive new vision for defeating disease: the old public health "sought the sources of infectious disease in the surroundings of man; the new finds them in man himself," wrote Hibbert Hill in The New Public Health in 1913
  • “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.”
  • As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,”
  • Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals,
  • Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research
  • Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education.
  • Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects.
  • That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,”
  • After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism.
  • Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.
  • Even public health began to treat people as if they lived in a social vacuum. Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.
  • Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,
  • “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”
  • In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.”
  • a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.
  • The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.”
  • Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May
  • the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th century: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons.
  • This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products.
  • “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,”
  • In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values.
  • Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”
  • “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”
  • The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department.
  • What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal? In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.
  • “We need to re-create alliances with others and help them to understand that what they are doing is public health,
Javier E

The nation's public health agencies are ailing when they're needed most - The Washingto... - 0 views

  • At the very moment the United States needed its public health infrastructure the most, many local health departments had all but crumbled, proving ill-equipped to carry out basic functions let alone serve as the last line of defense against the most acute threat to the nation’s health in generations.
  • Epidemiologists, academics and local health officials across the country say the nation’s public health system is one of many weaknesses that continue to leave the United States poorly prepared to handle the coronavirus pandemic
  • That system lacks financial resources. It is losing staff by the day.
  • Even before the pandemic struck, local public health agencies had lost almost a quarter of their overall workforce since 2008 — a reduction of almost 60,000 workers
  • The agencies’ main source of federal funding — the Centers for Disease Control and Prevention’s emergency preparedness budget — had been cut 30 percent since 2003. The Trump administration had proposed slicing even deeper.
  • According to David Himmelstein of the CUNY School of Public Health, global consensus is that, at minimum, 6 percent of a nation’s health spending should be devoted to public health efforts. The United States, he said, has never spent more than half that much.
  • the problems have been left to fester.
  • Delaware County, Pa., a heavily populated Philadelphia suburb, did not even have a public health department when the pandemic struck and had to rely on a neighbor to mount a response.
  • With plunging tax receipts straining local government budgets, public health agencies confront the possibility of further cuts in an economy gutted by the coronavirus. It is happening at a time when health departments are being asked to do more than ever.
  • While the country spends roughly $3.6 trillion every year on health, less than 3 percent of that spending goes to public health and prevention
  • “That’s the way we run much of our public health activity for local health departments. You apply to the CDC, which is the major conduit for federal funding to state and local health departments,” Himmelstein said. “You apply to them for funding for particular functions, and if you don’t get the grant, you don’t have the funding for that.”
  • Compared with Canada, the United Kingdom and northern European countries, the United States — with a less generous social safety net and no universal health care — is investing less in a system that its people rely on more.
  • Himmelstein said that the United States has never placed much emphasis on public health spending but that the investment began to decline even further in the early 2000s. The Great Recession fueled further cuts.
  • Plus, the U.S. public health system relies heavily on federal grants.
  • “Why an ongoing government function should depend on episodic grants rather than consistent funding, I don’t know,” he added. “That would be like seeing that the military is going to apply for a grant for its regular ongoing activities.”
  • Many public health officials say a lack of a national message and approach to the pandemic has undermined their credibility and opened them up to criticism.
  • Few places were less prepared for covid-19’s arrival than Delaware County, Pa., where Republican leaders had decided they did not need a public health department at all
  • “I think the general population didn’t really realize we didn’t have a health department. They just kind of assumed that was one of those government agencies we had,” Taylor said. “Then the pandemic hit, and everyone was like, ‘Wait, hold on — we don’t have a health department? Why don’t we have a health department?’ ”
  • Taylor and other elected officials worked out a deal with neighboring Chester County in which Delaware County paid affluent Chester County’s health department to handle coronavirus operations for both counties for now.
  • One reason health departments are so often neglected is their work focuses on prevention — of outbreaks, sexually transmitted diseases, smoking-related illnesses. Local health departments describe a frustrating cycle: The more successful they are, the less visible problems are and the less funding they receive. Often, that sets the stage for problems to explode again — as infectious diseases often do.
  • It has taken years for many agencies to rebuild budgets and staffing from deep cuts made during the last recession.
  • During the past decade, many local health departments have seen annual rounds of cuts, punctuated with one-time infusions of money following crises such as outbreaks of Zika, Ebola, measles and hepatitis. The problem with that cycle of feast or famine funding is that the short-term money quickly dries up and does nothing to address long-term preparedness.
  • “It’s a silly strategic approach when you think about what’s needed to protect us long term,”
  • She compared the country’s public health system to a house with deep cracks in the foundation. The emergency surges of funding are superficial repairs that leave those cracks unaddressed.
  • “We came into this pandemic at a severe deficit and are still without a strategic goal to build back that infrastructure. We need to learn from our mistakes,”
  • With the economy tanking, the tax bases for cities and counties have shrunken dramatically — payroll taxes, sales taxes, city taxes. Many departments have started cutting staff. Federal grants are no sure thing.
  • 80 percent of counties have reported their budget was affected in the current fiscal year because of the crisis. Prospects are even more dire for future budget periods, when the full impact of reduced tax revenue will become evident.
  • Christine Hahn, medical director for Idaho’s division of public health and a 25-year public health veteran, has seen the state make progress in coronavirus testing and awareness. But like so many public health officials across the country taking local steps to deal with what has become a national problem, she is limited by how much government leaders say she can do and by what citizens are willing to do.
  • “I’ve been through SARS, the 2009 pandemic, the anthrax attacks, and of course I’m in rural Idaho, not New York City and California,” Hahn said. “But I will say this is way beyond anything I’ve ever experienced as far as stress, workload, complexity, frustration, media and public interest, individual citizens really feeling very strongly about what we’re doing and not doing.”
  • At the same time, many countries that invest more in public health infrastructure also provide universal medical coverage that enables them to provide many common public health services as part of their main health-care-delivery system.
  • “People locally are looking to see what’s happening in other states, and we’re constantly having to talk about that and address that,”
  • “I’m mindful of the credibility of our messaging as people say, ‘What about what they’re doing in this place? Why are we not doing what they’re doing?’ ”
  • Many health experts worry the challenges will multiply in the fall with the arrival of flu season.
  • “The unfolding tragedy here is we need people to see local public health officials as heroes in the same way that we laud heart surgeons and emergency room doctors,” Westergaard, the Wisconsin epidemiologist, said. “The work keeps getting higher, and they’re falling behind — and not feeling appreciated by their communities.”
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage:
  • the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies”?
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed "there is a free exchange of views in this company" and "people are not afraid to say what they really think".
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times’s failure to take Trump’s chances seriously as an example of how “prematurely shutting down inquiry and debate” can allow “conventional wisdom to ossify in a way that blinds society.
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values also better serves not just truth but justice, too
  • over the long term journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance, will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • (I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and an opinion columnist. The paper’s spokeswoman, Danielle Rhoades Ha, did not respond to the question.)
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom’s journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, this reporter shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Ms Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to “refine” it and “add context”, and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.”
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

Opinion | Vaccine Hesitancy Is About Trust and Class - The New York Times - 0 views

  • The world needs to address the root causes of vaccine hesitancy. We can’t go on believing that the issue can be solved simply by flooding skeptical communities with public service announcements or hectoring people to “believe in science.”
  • For the past five years, we’ve conducted surveys and focus groups abroad and interviewed residents of the Bronx to better understand vaccine avoidance.
  • We’ve found that people who reject vaccines are not necessarily less scientifically literate or less well-informed than those who don’t. Instead, hesitancy reflects a transformation of our core beliefs about what we owe one another.
  • Over the past four decades, governments have slashed budgets and privatized basic services. This has two important consequences for public health
  • First, people are unlikely to trust institutions that do little for them.
  • second, public health is no longer viewed as a collective endeavor, based on the principle of social solidarity and mutual obligation. People are conditioned to believe they’re on their own and responsible only for themselves.
  • an important source of vaccine hesitancy is the erosion of the idea of a common good.
  • “People are thinking, ‘If the government isn’t going to do anything for us,’” said Elden, “‘then why should we participate in vaccines?’”
  • Since the spring, when most American adults became eligible for Covid vaccines, the racial gap in vaccination rates between Black and white people has been halved. In September, a national survey found that vaccination rates among Black and white Americans were almost identical.
  • Other surveys have determined that a much more significant factor was college attendance: Those without a college degree were the most likely to go unvaccinated.
  • Education is a reliable predictor of socioeconomic status, and other studies have similarly found a link between income and vaccination.
  • It turns out that the real vaccination divide is class.
  • compared with white Americans, communities of color do experience the American health care system differently. But a closer look at the data reveals a more complicated picture.
  • during the 1950s polio campaigns, for example, most people saw vaccination as a civic duty.
  • But as the public purse shrank in the 1980s, politicians insisted that it was no longer the government’s job to ensure people’s well-being; instead, Americans were to be responsible only for themselves and their own bodies
  • Entire industries, such as self-help and health foods, have sprung up on the principle that the key to good health lies in individuals making the right choices.
  • Without an idea of the common good, health is often discussed using the language of “choice.”
  • there are problems with reducing public health to a matter of choice. It gives the impression that individuals are wholly responsible for their own health.
  • This is despite growing evidence that health is deeply influenced by factors outside our control; public health experts now talk about the “social determinants of health,” the idea that personal health is never simply a reflection of individual lifestyle choices, but also of the class people are born into, the neighborhood they grew up in and the race they belong to.
  • food deserts and squalor are not easy problems to solve — certainly not by individuals or charities — and they require substantial government action.
  • Many medical schools teach “motivational interviewing,”
  • the deeper problem:
  • Being healthy is not cheap. Studies indicate that energy-dense foods with less nutritional value are more affordable, and low-cost diets are linked to obesity and insulin resistance.
  • This isn’t surprising, since we shop for doctors and insurance plans the way we do all other goods and services
  • Another problem with reducing well-being to personal choice is that this treats health as a commodity.
  • mothers devoted many hours to “researching” vaccines, soaking up parental advice books and quizzing doctors. In other words, they acted like savvy consumers
  • When thinking as a consumer, people tend to downplay social obligations in favor of a narrow pursuit of self-interest. As one parent told Reich, “I’m not going to put my child at risk to save another child.”
  • Such risk-benefit assessments for vaccines are an essential part of parents’ consumer research.
  • Vaccine uptake is so high among wealthy people because Covid is one of the gravest threats they face. In some wealthy Manhattan neighborhoods, for example, vaccination rates run north of 90 percent.
  • For poorer and working-class people, though, the calculus is different: Covid-19 is only one of multiple grave threats.
  • When viewed in the context of the other threats they face, Covid no longer seems uniquely scary.
  • Most of the people we interviewed in the Bronx say they are skeptical of the institutions that claim to serve the poor but in fact have abandoned them.
  • he and his friends find reason to view the government’s sudden interest in their well-being with suspicion. “They are over here shoving money at us,” a woman told us, referring to a New York City offer to pay a $500 bonus to municipal workers to get vaccinated. “And I’m asking, why are you so eager, when you don’t give us money for anything else?”
  • These views reinforce the work of social scientists who find a link between a lack of trust and inequality. And without trust, there is no mutual obligation, no sense of a common good.
  • The experience of the 1960s suggests that when people feel supported through social programs, they’re more likely to trust institutions and believe they have a stake in society’s health.
  • Research shows that private systems not only tend to produce worse health outcomes than public ones, but privatization creates what public health experts call “segregated care,” which can undermine the feelings of social solidarity that are critical for successful vaccination drives
  • In one Syrian city, for example, the health care system now consists of one public hospital so underfunded that it is notorious for poor care, a few private hospitals offering high-quality care that are unaffordable to most of the population, and many unlicensed and unregulated private clinics — some even without medical doctors — known to offer misguided health advice. Under such conditions, conspiracy theories can flourish; many of the city’s residents believe Covid vaccines are a foreign plot.
  • In many developing nations, international aid organizations are stepping in to offer vaccines. These institutions are sometimes more equitable than governments, but they are often oriented to donor priorities, not community needs.
  • “We have starvation and women die in childbirth,” one tribal elder told us. “Why do they care so much about polio? What do they really want?”
  • In America, anti-vaccine movements are as old as vaccines themselves; efforts to immunize people against smallpox prompted bitter opposition at the turn of the last century. But after World War II, these attitudes disappeared. In the 1950s, demand for the polio vaccine often outstripped supply, and by the late 1970s, nearly every state had laws mandating vaccinations for school with hardly any public opposition.
  • What changed? This was the era of large, ambitious government programs like Medicare and Medicaid.
  • The anti-measles policy, for example, was an outgrowth of President Lyndon Johnson’s Great Society and War on Poverty initiatives.
  • While the reasons vary by country, the underlying causes are the same: a deep mistrust in local and international institutions, in a context in which governments worldwide have cut social services.
  • Only then do the ideas of social solidarity and mutual obligation begin to make sense.
  • The types of social programs that best promote this way of thinking are universal ones, like Social Security and universal health care.
  • If the world is going to beat the pandemic, countries need policies that promote a basic, but increasingly forgotten, idea: that our individual flourishing is bound up in collective well-being.
Javier E

He Could Have Seen What Was Coming: Behind Trump's Failure on the Virus - The New York ... - 0 views

  • “Any way you cut it, this is going to be bad,” a senior medical adviser at the Department of Veterans Affairs, Dr. Carter Mecher, wrote on the night of Jan. 28, in an email to a group of public health experts scattered around the government and universities. “The projected size of the outbreak already seems hard to believe.”
  • A week after the first coronavirus case had been identified in the United States, and six long weeks before President Trump finally took aggressive action to confront the danger the nation was facing — a pandemic that is now forecast to take tens of thousands of American lives — Dr. Mecher was urging the upper ranks of the nation’s public health bureaucracy to wake up and prepare for the possibility of far more drastic action.
  • Throughout January, as Mr. Trump repeatedly played down the seriousness of the virus and focused on other issues, an array of figures inside his government — from top White House advisers to experts deep in the cabinet departments and intelligence agencies — identified the threat, sounded alarms and made clear the need for aggressive action.
  • The president, though, was slow to absorb the scale of the risk and to act accordingly, focusing instead on controlling the message, protecting gains in the economy and batting away warnings from senior officials.
  • Mr. Trump’s response was colored by his suspicion of and disdain for what he viewed as the “Deep State” — the very people in his government whose expertise and long experience might have guided him more quickly toward steps that would slow the virus, and likely save lives.
  • The slow start of that plan, on top of the well-documented failures to develop the nation’s testing capacity, left administration officials with almost no insight into how rapidly the virus was spreading. “We were flying the plane with no instruments,” one official said.
  • But dozens of interviews with current and former officials and a review of emails and other records revealed many previously unreported details and a fuller picture of the roots and extent of his halting response as the deadly virus spread:
  • The National Security Council office responsible for tracking pandemics received intelligence reports in early January predicting the spread of the virus to the United States, and within weeks was raising options like keeping Americans home from work and shutting down cities the size of Chicago. Mr. Trump would avoid such steps until March.
  • Despite Mr. Trump’s denial weeks later, he was told at the time about a Jan. 29 memo produced by his trade adviser, Peter Navarro, laying out in striking detail the potential risks of a coronavirus pandemic: as many as half a million deaths and trillions of dollars in economic losses.
  • The health and human services secretary, Alex M. Azar II, directly warned Mr. Trump of the possibility of a pandemic during a call on Jan. 30, the second warning he delivered to the president about the virus in two weeks. The president, who was on Air Force One while traveling for appearances in the Midwest, responded that Mr. Azar was being alarmist
  • Mr. Azar publicly announced in February that the government was establishing a “surveillance” system
  • the task force had gathered for a tabletop exercise — a real-time version of a full-scale war gaming of a flu pandemic the administration had run the previous year. That earlier exercise, also conducted by Mr. Kadlec and called “Crimson Contagion,” predicted 110 million infections, 7.7 million hospitalizations and 586,000 deaths following a hypothetical outbreak that started in China.
  • By the third week in February, the administration’s top public health experts concluded they should recommend to Mr. Trump a new approach that would include warning the American people of the risks and urging steps like social distancing and staying home from work.
  • But the White House focused instead on messaging and crucial additional weeks went by before their views were reluctantly accepted by the president — time when the virus spread largely unimpeded.
  • When Mr. Trump finally agreed in mid-March to recommend social distancing across the country, effectively bringing much of the economy to a halt, he seemed shellshocked and deflated to some of his closest associates. One described him as “subdued” and “baffled” by how the crisis had played out. An economy that he had wagered his re-election on was suddenly in shambles.
  • He only regained his swagger, the associate said, from conducting his daily White House briefings, at which he often seeks to rewrite the history of the past several months. He declared at one point that he “felt it was a pandemic long before it was called a pandemic,” and insisted at another that he had to be a “cheerleader for the country,” as if that explained why he failed to prepare the public for what was coming.
  • Mr. Trump’s allies and some administration officials say the criticism has been unfair.
  • The Chinese government misled other governments, they say. And they insist that the president was either not getting proper information, or the people around him weren’t conveying the urgency of the threat. In some cases, they argue, the specific officials he was hearing from had been discredited in his eyes, but once the right information got to him through other channels, he made the right calls.
  • “While the media and Democrats refused to seriously acknowledge this virus in January and February, President Trump took bold action to protect Americans and unleash the full power of the federal government to curb the spread of the virus, expand testing capacities and expedite vaccine development even when we had no true idea the level of transmission or asymptomatic spread,” said Judd Deere, a White House spokesman.
  • Decision-making was also complicated by a long-running dispute inside the administration over how to deal with China
  • The Containment Illusion: By the last week of February, it was clear to the administration’s public health team that schools and businesses in hot spots would have to close. But in the turbulence of the Trump White House, it took three more weeks to persuade the president that failure to act quickly to control the spread of the virus would have dire consequences.
  • There were key turning points along the way, opportunities for Mr. Trump to get ahead of the virus rather than just chase it. There were internal debates that presented him with stark choices, and moments when he could have chosen to ask deeper questions and learn more. How he handled them may shape his re-election campaign. They will certainly shape his legacy.
  • Facing the likelihood of a real pandemic, the group needed to decide when to abandon “containment” — the effort to keep the virus outside the U.S. and to isolate anyone who gets infected — and embrace “mitigation” to thwart the spread of the virus inside the country until a vaccine becomes available.
  • Among the questions on the agenda, which was reviewed by The New York Times, was when the department’s secretary, Mr. Azar, should recommend that Mr. Trump take textbook mitigation measures “such as school dismissals and cancellations of mass gatherings,” which had been identified as the next appropriate step in a Bush-era pandemic plan.
  • The group — including Dr. Anthony S. Fauci of the National Institutes of Health; Dr. Robert R. Redfield of the Centers for Disease Control and Prevention, and Mr. Azar, who at that stage was leading the White House Task Force — concluded they would soon need to move toward aggressive social distancing
  • A 20-year-old Chinese woman had infected five relatives with the virus even though she never displayed any symptoms herself. The implication was grave — apparently healthy people could be unknowingly spreading the virus — and supported the need to move quickly to mitigation.
  • The following day, Dr. Kadlec and the others decided to present Mr. Trump with a plan titled “Four Steps to Mitigation,” telling the president that they needed to begin preparing Americans for a step rarely taken in United States history.
  • a presidential blowup and internal turf fights would sidetrack such a move. The focus would shift to messaging and confident predictions of success rather than publicly calling for a shift to mitigation.
  • These final days of February, perhaps more than any other moment during his tenure in the White House, illustrated Mr. Trump’s inability or unwillingness to absorb warnings coming at him.
  • He instead reverted to his traditional political playbook in the midst of a public health calamity, squandering vital time as the coronavirus spread silently across the country.
  • A memo dated Feb. 14, prepared in coordination with the National Security Council and titled “U.S. Government Response to the 2019 Novel Coronavirus,” documented what more drastic measures would look like, including: “significantly limiting public gatherings and cancellation of almost all sporting events, performances, and public and private meetings that cannot be convened by phone. Consider school closures. Widespread ‘stay at home’ directives from public and private organizations with nearly 100% telework for some.”
  • his friend had a blunt message: You need to be ready. The virus, he warned, which originated in the city of Wuhan, was being transmitted by people who were showing no symptoms — an insight that American health officials had not yet accepted.
  • On the 18-hour plane ride home, Mr. Trump fumed as he watched the stock market crash after Dr. Messonnier’s comments. Furious, he called Mr. Azar when he landed at around 6 a.m. on Feb. 26, raging that Dr. Messonnier had scared people unnecessarily.
  • The meeting that evening with Mr. Trump to advocate social distancing was canceled, replaced by a news conference in which the president announced that the White House response would be put under the command of Vice President Mike Pence.
  • The push to convince Mr. Trump of the need for more assertive action stalled. With Mr. Pence and his staff in charge, the focus was clear: no more alarmist messages. Statements and media appearances by health officials like Dr. Fauci and Dr. Redfield would be coordinated through Mr. Pence’s office
  • It would be more than three weeks before Mr. Trump would announce serious social distancing efforts, a lost period during which the spread of the virus accelerated rapidly. Over nearly three weeks from Feb. 26 to March 16, the number of confirmed coronavirus cases in the United States grew from 15 to 4,226
  • The China Factor: The earliest warnings about coronavirus got caught in the crosscurrents of the administration’s internal disputes over China. It was the China hawks who pushed earliest for a travel ban. But their animosity toward China also undercut hopes for a more cooperative approach by the world’s two leading powers to a global crisis.
  • It was early January, and the call with a Hong Kong epidemiologist left Matthew Pottinger rattled.
  • Mr. Trump was walking up the steps of Air Force One to head home from India on Feb. 25 when Dr. Nancy Messonnier, the director of the National Center for Immunization and Respiratory Diseases, publicly issued the blunt warning they had all agreed was necessary.
  • It was one of the earliest warnings to the White House, and it echoed the intelligence reports making their way to the National Security Council
  • some of the more specialized corners of the intelligence world were producing sophisticated and chilling warnings.
  • In a report to the director of national intelligence, the State Department’s epidemiologist wrote in early January that the virus was likely to spread across the globe, and warned that the coronavirus could develop into a pandemic
  • Working independently, a small outpost of the Defense Intelligence Agency, the National Center for Medical Intelligence, came to the same conclusion.
  • By mid-January there was growing evidence of the virus spreading outside China. Mr. Pottinger began convening daily meetings about the coronavirus
  • The early alarms sounded by Mr. Pottinger and other China hawks were freighted with ideology — including a push to publicly blame China that critics in the administration say was a distraction
  • And they ran into opposition from Mr. Trump’s economic advisers, who worried a tough approach toward China could scuttle a trade deal that was a pillar of Mr. Trump’s re-election campaign.
  • Mr. Pottinger continued to believe the coronavirus problem was far worse than the Chinese were acknowledging. Inside the West Wing, the director of the Domestic Policy Council, Joe Grogan, also tried to sound alarms that the threat from China was growing.
  • The Consequences of Chaos: The chaotic culture of the Trump White House contributed to the crisis. A lack of planning and a failure to execute, combined with the president’s focus on the news cycle and his preference for following his gut rather than the data, cost time, and perhaps lives.
  • the hawks kept pushing in February to take a critical stance toward China amid the growing crisis. Mr. Pottinger and others — including aides to Secretary of State Mike Pompeo — pressed for government statements to use the term “Wuhan Virus.” Mr. Pompeo tried to hammer the anti-China message at every turn, eventually even urging leaders of the Group of 7 industrialized countries to use “Wuhan virus” in a joint statement.
  • Others, including aides to Mr. Pence, resisted taking a hard public line, believing that angering Beijing might lead the Chinese government to withhold medical supplies, pharmaceuticals and any scientific research that might ultimately lead to a vaccine.
  • Mr. Trump took a conciliatory approach through the middle of March, praising the job Mr. Xi was doing.
  • That changed abruptly, when aides informed Mr. Trump that a Chinese Foreign Ministry spokesman had publicly spun a new conspiracy about the origins of Covid-19: that it was brought to China by U.S. Army personnel who visited the country last October.
  • On March 16, he wrote on Twitter that “the United States will be powerfully supporting those industries, like Airlines and others, that are particularly affected by the Chinese Virus.”
  • Mr. Trump’s decision to escalate the war of words undercut any remaining possibility of broad cooperation between the governments to address a global threat
  • Mr. Pottinger, backed by Mr. O’Brien, became one of the driving forces of a campaign in the final weeks of January to convince Mr. Trump to impose limits on travel from China
  • he circulated a memo on Jan. 29 urging Mr. Trump to impose the travel limits, arguing that failing to confront the outbreak aggressively could be catastrophic, leading to hundreds of thousands of deaths and trillions of dollars in economic losses.
  • The uninvited message could not have conflicted more with the president’s approach at the time of playing down the severity of the threat. And when aides raised it with Mr. Trump, he responded that he was unhappy that Mr. Navarro had put his warning in writing.
  • From the time the virus was first identified as a concern, the administration’s response was plagued by the rivalries and factionalism that routinely swirl around Mr. Trump and, along with the president’s impulsiveness, undercut decision making and policy development.
  • Even after Mr. Azar first briefed him about the potential seriousness of the virus during a phone call on Jan. 18 while the president was at his Mar-a-Lago resort in Florida, Mr. Trump projected confidence that it would be a passing problem.
  • “We have it totally under control,” he told an interviewer a few days later while attending the World Economic Forum in Switzerland. “It’s going to be just fine.”
  • The efforts to sort out policy behind closed doors were contentious and sometimes only loosely organized.
  • That was the case when the National Security Council convened a meeting on short notice on the afternoon of Jan. 27. The Situation Room was standing room only, packed with top White House advisers, low-level staffers, Mr. Trump’s social media guru, and several cabinet secretaries. There was no checklist about the preparations for a possible pandemic,
  • Instead, after a 20-minute description by Mr. Azar of his department’s capabilities, the meeting was jolted when Stephen E. Biegun, the newly installed deputy secretary of state, announced plans to issue a “level four” travel warning, strongly discouraging Americans from traveling to China. The room erupted into bickering.
  • A few days later, on the evening of Jan. 30, Mick Mulvaney, the acting White House chief of staff at the time, and Mr. Azar called Air Force One as the president was making the final decision to go ahead with the restrictions on China travel. Mr. Azar was blunt, warning that the virus could develop into a pandemic and arguing that China should be criticized for failing to be transparent.
  • Stop panicking, Mr. Trump told him. That sentiment was present throughout February, as the president’s top aides reached for a consistent message but took few concrete steps to prepare for the possibility of a major public health crisis.
  • As February gave way to March, the president continued to be surrounded by divided factions even as it became clearer that avoiding more aggressive steps was not tenable.
  • the virus was already multiplying across the country — and hospitals were at risk of buckling under the looming wave of severely ill people, lacking masks and other protective equipment, ventilators and sufficient intensive care beds. The question loomed over the president and his aides after weeks of stalling and inaction: What were they going to do?
  • Even then, and even by Trump White House standards, the debate over whether to shut down much of the country to slow the spread was especially fierce.
  • In a tense Oval Office meeting, when Mr. Mnuchin again stressed that the economy would be ravaged, Mr. O’Brien, the national security adviser, who had been worried about the virus for weeks, sounded exasperated as he told Mr. Mnuchin that the economy would be destroyed regardless if officials did nothing.
  • in the end, aides said, it was Dr. Deborah L. Birx, the veteran AIDS researcher who had joined the task force, who helped to persuade Mr. Trump. Soft-spoken and fond of the kind of charts and graphs Mr. Trump prefers, Dr. Birx did not have the rough edges that could irritate the president. He often told people he thought she was elegant.
  • During the last week in March, Kellyanne Conway, a senior White House adviser involved in task force meetings, gave voice to concerns other aides had. She warned Mr. Trump that his wished-for date of Easter to reopen the country likely couldn’t be accomplished. Among other things, she told him, he would end up being blamed by critics for every subsequent death caused by the virus.
Javier E

How China's buses shaped the world's EV revolution - BBC Future - 0 views

  • After around two decades of government support, China now boasts the world's largest market for e-buses, making up more than 95% of global stock. At the end of 2022, China's Ministry of Transport announced that more than three-quarters (77% or 542,600) of all urban buses in the country were "new energy vehicles", a term used by the Chinese government to include pure electric, plug-in hybrids, and fuel cell vehicles powered by alternative fuels such as hydrogen and methanol. In 2022, around 84% of the new energy bus fleet was pure electric.
  • In 2015, 78% of Chinese urban buses still used diesel or gas, according to the World Resources Institute (WRI). The NGO now estimates that if China follows through on its stated decarbonisation policies, its road transport emissions will peak before 2030.
  • China is also home to some of the world’s biggest electric bus manufacturers, such as Yutong, which has been racking up orders across China, Europe and Latin America.
  • "China has really been at the forefront of success in conversion of all vehicles to electric vehicles, especially buses," says Heather Thompson, chief executive officer of the Institute for Transportation and Development Policy (ITDP), a non-profit focusing on sustainable transport solutions. "The rest of the world is trying to do the same, but I think China is really out ahead."
  • At the time of China's 2001 entry into the World Trade Organisation, the international automotive industry was dominated by European, US and Japanese brands. These companies had spent decades perfecting internal combustion engine technology. To compete, Beijing decided to find a new track for its auto industry: making cars that did not use conventional engines.
  • That same year, the central government launched the so-called "863 plan" for EV research and development. There were numerous practical challenges, however, in the way of mass electrification. Not many manufacturers were making new energy vehicles, buyers were few and there was a lack of charging infrastructure in existence. The answer? Buses.
  • "The Chinese government adopted a very smart strategy," says Liu Daizong, ITDP's East Asia director. "They realised quite early on that they should drive [the EV industry] through electric buses," he notes, since their public service status meant Beijing "could have a strong hand on their electrification".
  • "Bus routes were fixed. This means when an electric bus finished a round, it could return to the depot to recharge," explains Xue Lulu, a mobility manager at the World Resources Institute (WRI) China. The typical daily mileage of a Chinese bus ­– 200km (120 miles) – was a realistic range for battery makers to meet.
  • In 2009, the country began its large-scale rollout of new energy buses with the "Ten Cities and Thousand Vehicles" programme. Over three years, the programme aimed to provide 10 cities with financial subsidies to promote 1,000 public-sector new energy vehicles in each, annually. Its goal was to have 10% new energy vehicles in the country by the end of 2012.
  • Strong policy support from both central and regional governments "gave manufacturers confidence in setting up production lines and stepping up research efforts," says Liu.
  • Together, these strong and consistent government signals encouraged Chinese manufacturers to expand their EV production capacity, bring down costs and improve their technologies. One such company was Build Your Dream, better known as BYD. The Shenzhen-based firm, the world's largest EV maker in 2022, had ballooned its business a decade earlier by supplying electric buses and taxis for China's EV pilot cities.
  • "Back then, most buses used diesel, which was a main source of nitrogen oxides (NOx) emissions," says Xue, referring to the air pollution that smothered Beijing and other Chinese cities in the early 2010s. Yet in 2013, a new plan from central government cited tackling air pollution as one of the reasons for rolling out EVs.
  • This addition proved to be critical: it not only connected EV uptake with people's health, it also indirectly tied the e-bus campaign to local officials' political performance, as the central government would soon hand air-quality targets to all provinces.
  • The years 2013 and 2014 proved to be important for China's EV push. For the first time, the central government made EV purchase subsidies available to individual consumers, not just the public sector, opening the floodgates to private ownership. Additionally, it offered discounted electricity tariffs to bus operators to make sure the cost of running electric buses would be "significantly lower than" that of their oil- or gas-powered equivalents.
  • The new economic push, plus local government's determination to battle air pollution, generated great enthusiasm for e-buses. By the end of 2015, the number of EV pilot cities rocketed from 25 to 88. In the same year, the central government set a target of 200,000 new energy buses on the road by 2020 and announced a plan to phase out its subsidies for fossil-fuel-powered buses.
  • To further stimulate the market, many cities devised various local policies on top of national incentives. For example, Shenzhen, a southern city with a population of more than 17 million, encouraged government agencies to work with private companies to create a full range of renting mechanisms for bus operators
  • Different cities' bus operators also designed different charging strategies. "Buses in Shenzhen had bigger batteries, so they normally charged overnight," says Xue, of WRI China. Between 2016 and 2020, Shanghai, another electric bus hub, subsidised the electricity e-buses used – regardless of the time of day – to give them more flexibility in charging.
  • Generous financial support did lead to problems. In 2016, an EV subsidy fraud scandal shook China, with some bus operators found to have exaggerated the number of e-buses they had purchased. So that same year Beijing shifted its EV subsidy rules so that bus operators could only receive financial support once a bus's mileage reached 30,000km (19,000 miles).
  • One year later, the government announced the so-called "dual-credit" policy. This allowed new energy vehicle makers to rack up credits which they could sell for cash to those needing to offset "negative credits" generated from making conventional cars.
  • It wasn't only China's buses that benefitted. China's e-bus campaign helped create a big and stable market for its wider EV industry, brought down costs and created economies of scale. In 2009, the year the e-bus campaign was rolled out, the total number of new energy vehicles sold stood at 2,300; by 2022, it was 6.9 million, according to analysis by Huang Zheng.
  • By 2022, the country had also built the world's largest EV charging network, with 1.8 million public charging stations – or two-thirds of the global total – and 3.4 million private equivalents. This means that on average, there is one charging pillar for every 2.5 of China's 13.1 million new energy vehicles (the arithmetic is spelled out in the short check after this list).
  • Cold weather is a problem, too, as it can make a battery's charging time longer and its range shorter. The reason China has not achieved 100% electrification for its buses is its northern regions, which have harsh winters, says Xue.
  • To make e-buses truly "green", they should also be charged with renewable power, Wang says. But last year coal power still accounted for 58.4% of China's energy mix, according to the China Electricity Council, a trade body.
  • Globally, however, China is now in a league of its own in uptake of e-buses. By 2018, about 421,000 of the world's 425,000 electric buses were located in China; Europe had about 2,250 and the US owned around 300.
  • But earlier this year, the European Commission announced a zero-emission target for all new city buses by 2030. And some countries are increasing their overall funding for the transition.
  • In 2020, the European Commission approved Germany's plan to double its aid for e-buses to €650m (£558m/$707m), then again in 2021 to €1.25bn (£1.07bn/$1.3bn). And the UK, which last year had the largest electric bus fleet in Europe with 2,226 pure electric and hybrid buses, has announced another £129m ($164m) to help bus operators buy zero-emissions fleets.
  • Countries have thus responded to China's manufacturing lead in divergent ways. "While the US has opted for a more competitive angle by fostering its own e-bus production, regions like Latin America are more open to trade with China due to a more friendly trading setup through [China's] Belt and Road Initiative."
  • In order to avoid direct competition from Chinese manufacturers, the US has come up with a "school-bus strategy", says Liu. The Chinese don't make the iconic yellow vehicles, so this could ignite American e-bus manufacturing and create a local industry chain, he suggests. Backed by the US Environmental Protection Agency's $5bn (£3.9bn) Clean School Bus Programme, the national effort has so far committed to providing 5,982 buses.
  • In contrast, many Latin American cities, such as the Colombian capital of Bogota and the Chilean capital of Santiago, are greening their traditional bus sectors with the help of Chinese manufacturers, who are the largest providers to the region. In 2020, Chile became the country with the most Chinese e-buses outside of China, and this year Santiago's public transport operator announced it had ordered 1,022 e-buses from Beijing-based Foton Motor, the biggest overseas deal the firm had received.
  • Chinese manufacturers are likely to receive many more orders from Chile and its neighbours this decade. According to the latest research by the global C40 Cities network, the number of electric buses in 32 Latin American cities is expected to increase by more than seven times by 2030, representing an investment opportunity of over $11.3bn (£8.9bn).
  • In June 2023, BloombergNEF forecast half of the world's buses to be entirely battery-powered by 2032, a decade ahead of cars. And by 2026, 36% and 24% of municipal bus sales in Europe and the US, respectively, are expected to be EVs as they begin to catch up with China
  • To meet the global climate goals set by the Paris Agreement, simply switching the world's existing bus fleets might not be enough. According to ITDP, the cumulative greenhouse gas emissions from urban passenger transport globally must stay below the equivalent of 66 gigatonnes of CO2 between 2020 and 2050 for the world to meet the 1.5C temperature goal. Staying within this limit will only be possible if the world not only adopts electric buses, but also goes through a broader shift away from private transport.
  • "We can't just focus on [replacing] the buses that exist, we need to actually get many, many more buses on the streets," Thompson adds. She and her team estimate that the world would need about 10 million more buses through 2030, and 46 million more buses cumulatively through 2050, to make public transport good enough to have a shot at achieving the Paris Agreement. And all those buses will need to be electric.
  • In China, therefore, even though EVs are selling faster than ever, the central government has instructed cities to encourage public transport use, as well as walking and riding bikes.
  • In Wang's hometown, meanwhile, which has just over three million residents, the local government has gone one step further and made all bus rides free. All citizens need to do is swipe an app, at no charge, to get onto the bus. "My aunt loves taking buses now," says Wang. "She says it is so convenient."
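As a quick check of the charger-to-vehicle ratio flagged above, using only the figures given in that bullet (1.8 million public and 3.4 million private charging pillars against 13.1 million new energy vehicles):

$$\frac{13.1\ \text{million vehicles}}{(1.8 + 3.4)\ \text{million charging pillars}} = \frac{13.1}{5.2} \approx 2.5\ \text{vehicles per pillar}$$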
Javier E

Our politics isn't designed to protect the public from Covid-19 | George Monbiot | Opin... - 0 views

  • The worst possible people are in charge at the worst possible time. In the UK, the US and Australia, the politics of the governing parties have been built on the dismissal and denial of risk.
  • Just as these politics have delayed the necessary responses to climate breakdown, ecological collapse, air and water pollution, obesity and consumer debt, so they appear to have delayed the effective containment of Covid-19.
  • I believe it is no coincidence that these three governments have responded later than comparable nations have, and with measures that seemed woefully unmatched to the scale of the crisis
  • ...14 more annotations...
  • To have responded promptly and sufficiently would have meant jettisoning an entire structure of political thought developed in these countries over the past half century.
  • Politics is best understood as public relations for particular interests. The interests come first; politics is the means by which they are justified and promoted
  • On the left, the dominant interest groups can be very large – everyone who uses public services, for instance
  • On the right they tend to be much smaller. In the US, the UK and Australia, they are very small indeed: mostly multimillionaires and a very particular group of companies: those whose profits depend on the cavalier treatment of people and planet
  • I’ve seen how the tobacco companies covertly funded an infrastructure of persuasion to deny the impacts of smoking. This infrastructure was then used, often by the same professional lobbyists, to pour doubt on climate science and attack researchers and environmental campaigners.
  • These companies funded rightwing thinktanks and university professors to launch attacks on public health policy in general and create a new narrative of risk, tested on focus groups and honed in the media.
  • They reframed responsible government as the “nanny state”, the “health police” and “elf ’n’ safety zealots”. They dismissed scientific findings and predictions as “unfounded fears”, “risk aversion” and “scaremongering”.
  • Public protections were recast as “red tape”, “interference” and “state control”. Government itself was presented as a mortal threat to our freedom.
  • The groups these corporations helped to fund – thinktanks and policy units, lobbyists and political action committees – were then used by other interests: private health companies hoping to break up the NHS, pesticide manufacturers seeking to strike down regulatory controls, junk food manufacturers resisting advertising restrictions, billionaires seeking to avoid tax
  • Between them, these groups refined the justifying ideology for fragmenting and privatising public services, shrinking the state and crippling its ability to govern.
  • Now, in these three nations, this infrastructure is the government. No 10 Downing Street has been filled with people from groups strongly associated with attacks on regulation and state intervention.
  • Modern politics is impossible to understand without grasping the pollution paradox. The greater the risk to public health and wellbeing a company presents, the more money it must spend on politics – to ensure it isn’t regulated out of existence. Political spending comes to be dominated by the dirtiest companies
  • The theory on which this form of government is founded can seem plausible and logically consistent. Then reality hits, and we find ourselves in the worst place from which to respond to crisis, with governments that have an ingrained disregard for public safety and a reflexive resort to denial
  • It is what we see today, as the Trump, Johnson and Morrison governments flounder in the face of this pandemic. They are called upon to govern, but they know only that government is the enemy.
Javier E

Pandemic Shoppers Are a Nightmare to Service Workers - The Atlantic - 0 views

  • For generations, American shoppers have been trained to be nightmares. The pandemic has shown just how desperately the consumer class clings to the feeling of being served.
  • The most immediate culprit is decades of cost-cutting; by increasing surveillance and pressure on workers during shifts, reducing their hours and benefits, and not replacing those who quit, executives can shine up a business’s balance sheet in a hurry.
  • Wages and resources dwindle, and more expensive and experienced workers get replaced with fewer and more poorly trained new hires. When customers can’t find anyone to help them or have to wait too long in line, they take it out on whichever overburdened employee they eventually hunt down.
  • ...26 more annotations...
  • As the production of food and material goods centralized and rapidly expanded, commerce reached a scale that the country's existing stores were ill-equipped to handle, according to the historian Susan Strasser, the author of Satisfaction Guaranteed: The Making of the American Mass Market. Manufacturers needed ways to distribute their newly enormous outputs and educate the public on the wonder of all their novel options. Americans, in short, had to be taught how to shop.
  • In 2019, one in five American workers was employed in retail, food service, or hospitality; even more are now engaged in service work of some kind.
  • This dynamic is exacerbated by the fact that the United States has more service workers than ever before, doing more types of labor, spread thin across the economy
  • Customers might not have been able to afford a household staff to do their bidding like the era’s truly wealthy, but corporate stores offered them a little taste of what that would be like. The middle class began to see itself as the small-time beneficiaries of industrialization’s barons.
  • With these goals in mind, Leach writes, customer service was born. For retailers’ tactics to be successful, consumers—or guests, as department stores of the era took to calling them—needed to feel appreciated and rewarded
  • From 1870 to 1910, the number of service workers in the United States quintupled. It’s from this morass that “The customer is always right” emerged as the essential precept of American consumerism—service workers weren’t there just to ring up orders
  • they were there to fuss and fawn, to bolster egos, to reassure wavering buyers, to make dreams come true.
  • they were also quite intentionally building something far grander: class consciousness. Leach writes that the introduction of shopping was fundamental to forming middle-class identity at a particularly crucial moment, as the technological advances of the Gilded Age helped create the American office worker as we now know it.
  • Retailers won over this growing middle class by convincing its members that they were separate from—and opposed to—industrial workers and their distrust of corporate power,
  • For many of these workers, the difficulty of finding non-service employment enables companies to pay low wages and keep their prices artificially low, which consumers generally like as long as they don’t have to think about what makes it possible. In theory, these conditions are supposed to encourage better performance on the part of the worker; in practice, they also encourage cruelty on the part of the consumer.
  • Previously confined to a few lavish European-owned hotels in America, tipping “aristocratized consumption,
  • Department-store magnates alleviated these concerns by linking department stores to the public good. Retailers started inserting themselves into these communities as much as possible, Leach writes, turning their enormous stores into domains of urban civic life. They hosted free concerts and theatrical performances, offered free child care, displayed fine art, and housed restaurants, tearooms, Turkish baths, medical and dental services, banks, and post offices. They made splashy contributions to local charities and put on holiday parades and fireworks shows. This created the impression that patronizing their stores wouldn’t just be a practical transaction or an individual pleasure, but an act of benevolence toward the orderly society those stores supported.
  • In the 150 years that American consumerism has existed, it has metastasized into almost every way that Americans construct their identities. Today’s brands insert themselves into current events, align themselves with causes, associate patronage of their businesses with virtue and discernment and success.
  • Most Americans now expect corporations to take a stand on contentious social and political issues; in return, corporations have even co-opted some of the language of actual politics, encouraging consumers to “vote with their dollars” for the companies that market themselves on the values closest to their own.
  • For Americans in a socially isolating culture, living under an all but broken political system, the consumer realm is the place where many people can most consistently feel as though they are asserting their agency.
  • Being corrected by a salesperson, forgotten by a bartender, or brushed off by a flight attendant isn’t just an annoyance—for many people, it is an existential threat to their self-understanding.
  • “The notion that at the restaurant, you’re better than the waiters, it becomes part of the restaurant experience,” and also part of how some patrons understand their place in the world. Compounding this sense of superiority is the fact that so many service workers are from historically marginalized groups—the workforce is disproportionately nonwhite and female.
  • Because consumer identities are constructed by external forces, Strasser said, they are uniquely vulnerable, and the people who hold them are uniquely insecure
  • If your self-perception is predicated on how you spend your money, then you have to keep spending it, especially if your overall class status has become precarious, as it has for millions of middle-class people in the past few decades
  • Although underpaid, poorly treated service workers certainly exist around the world, American expectations on their behavior are particularly extreme and widespread, according to Nancy Wong, a consumer psychologist and the chair of the consumer-science department at the University of Wisconsin. “Business is at fault here,” Wong told me. “This whole industry has profited from exploitation of a class of workers that clearly should not be sustainable.”
  • Tipping ratcheted up the level of control that members of the middle class could exercise over the service workers beneath them: Consumers could deny payment—effectively, deny workers their wages—for anything less than complete submission.
  • Modern businesses have invented novel ways to exacerbate conflicts between their customers and their workers.
  • A big problem at airlines and hotels in particular, Wong said, is what’s called the “customer relationship management” model. CRM programs, the first and most famous of which are frequent-flyer miles, are fabulously profitable; awarding points or miles or bucks encourages people not only to increase the size and frequency of their purchases, but also to confine their spending to one airline or hotel chain or big-box store.
  • Higher-spending customers access varying levels of luxury and prestige, often in full view of everyone else. Exposure to these consumer inequalities has been found to spark antisocial behavior in those who don’t get to enjoy their perks, the classic example of which is air rage
  • Workers must do what the sociologist Arlie Russell Hochschild, in her 1983 book, The Managed Heart, identified as “emotional labor.”
  • Workers must stifle their natural emotional reactions to, in the case of those in the service industry, placate members of the consumer class. These workers are alienated from their own emotional well-being, which can have far-reaching psychological consequences—over the years, research has associated this kind of work with elevated levels of stress hormones, burnout, depression, and increased alcohol consumption.
Javier E

Now, With No Further Ado, We Present ... the Digital Public Library of America! - Rebec... - 0 views

  • we will have a platform that others can build upon. All the data will be licensed under CC0 -- that's really a public domain declaration. It means that we're giving away all this data for free for people to use in whatever way they want. And we will have an API -- a very powerful API -- that third-party developers will be able to use to create innovative apps based on the contents of the DPLA. So if you're a developer of a mobile app, maybe one for a local walking tour of a city, you can take the material you already have and mix it up with all the great content from the DPLA for that particular location. (A minimal sketch of such an API call follows this list.)
  • We act as the top-level aggregator of all this great material, and the service hubs do an amazing job of normalizing the metadata and bringing in this content from thousands of sites across the United States.
  • act as a very strong advocate for public options for reading and research in the 21st century. We really want to work to expand the realm of publicly available materials. So, obviously, a big part of that is working with non-profit groups like libraries, archives, and museums to get that stuff online and out to the public, but there will also be a component here where I'm going to push, along with my colleagues at the DPLA, to see how we can get other materials into the DPLA and out to the public. It very much has that spirit of the public library. We want to make the maximal amount of content available in a maximally open way.
  • ...3 more annotations...
  • We are in the process of geocoding as many of them as possible, so that they'll work great in those kinds of GPS-based devices and apps. So, that platform is going to be a big part of it and we're hoping to see a lot of partners -- commercial and non-profit -- use that.
  • We're going to provide some really unique services that will help supplement what's going on in public and research libraries. For instance, that map interface is a way to browse the collections. That's not generally an interface that you see on your local public library's website.
  • I hope that the DPLA can act, in some senses, as a market maker. I hope we can bring a huge audience to content, and when that happens, you might have, for instance, authors or publishers becoming very interested in how they might be able to put materials into DPLA to attract new readers and researchers.
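To make the API mentioned above concrete, here is a minimal sketch of the kind of call a third-party developer might make. It assumes the DPLA's public v2 items endpoint (https://api.dp.la/v2/items) and its q, page_size and api_key parameters; the API key value, the query string and the search_dpla helper are illustrative placeholders, not anything specified in the interview.

```python
# Minimal sketch: keyword search against a DPLA-style items API.
# Assumes the public v2 endpoint and its q/page_size/api_key parameters;
# the key and query below are placeholders.
import requests

DPLA_API_KEY = "your-api-key-here"  # placeholder; request a real key from the DPLA

def search_dpla(query: str, page_size: int = 5) -> list:
    """Return the titles of items matching a keyword query."""
    resp = requests.get(
        "https://api.dp.la/v2/items",
        params={"q": query, "page_size": page_size, "api_key": DPLA_API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    # Each returned doc keeps its descriptive metadata under "sourceResource".
    return [doc.get("sourceResource", {}).get("title", "")
            for doc in resp.json().get("docs", [])]

if __name__ == "__main__":
    # e.g. material a walking-tour app might pull in for one location
    for title in search_dpla("Boston walking tour"):
        print(title)
```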
Javier E

The Bus Is the Best Public Transit for Cities - The Atlantic - 0 views

  • A city is a place where many people live close together. The problem of urban transportation is a problem of sharing space.
  • When you drive alone (or take Uber alone) in a gridlocked street or freeway, you are taking more than your fair share of the limited space. When stuck in traffic, you are blocking others from moving freely.
  • If cities want to move people faster than walking while allowing them to take up only their fair share of space, two options arise. One is to use a vehicle that’s not much bigger than the human body, such as bicycles and scooters. Those tools work well for certain people in particular circumstances, but not for everyone. The other option is to share the ride in a vehicle. If space is really scarce, that vehicle will have to carry lots of people. In most cases, riders will have to share a vehicle with strangers, people who are not traveling for the same purposes or even to the same places. That’s what public transit is.
  • ...5 more annotations...
  • So what technologies make sense in public transit? Efficient transit networks are made of many technologies, each the right one for its own situation. Rail is for high-capacity markets, where you need to move hundreds of people per vehicle. Ferries and aerial gondolas overcome certain obstacles. But everywhere else, the bus is the thing that’s easiest to make abundant. Because labor is the main limit on their quantity, they can be much more abundant after full automation.
  • And walking is key to it. Out in low-density suburbs, residents can also drive to fixed-transit stops. But in the dense city, there’s no room for that. The microtransit promise of “service to your door” is a promise to abolish walking, and yet walking is the essence of how people share precious space.
  • Fixed public transit deploys large vehicles flowing along a set path, and riders gathering at stops to use them. That way, the vehicles can follow a fairly straight line, and they don’t need to stop once for every customer. That is what makes them worth walking to get to. It is one of the best ideas in the history of transportation
  • If the buses are terrible in your city, you may think that buses are terrible in general. In truth, a city’s bus service is as good as its leaders and voters want it to be. Where voters have funded better bus services and cities have worked to give them priority, as in Seattle, ridership has soared.
  • The starvation of high-ridership public transit in America is a choice, one that Americans don’t have to make. I work in cities all over the developed world, but my U.S. clients always have the poorest transit budgets, requiring the most painful trade-offs. They can’t afford to run the frequent and reliable fixed-route services that would do well, so they are forced to run poor service, yielding low ridership, feeding the impression that transit is pointless
Javier E

Afghanistan Is Your Fault - The Atlantic - 0 views

  • American citizens will separate into their usual camps and identify all of the obvious causes and culprits except for one: themselves.
  • Much of what happened in Korea and Vietnam—ultimately constituting a tie and a loss, if we are to be accurate—was beyond the control of the American public. Boys were drafted and sent into battle, sometimes in missions never intended to be revealed to the public.
  • Afghanistan was different. This was a war that was immensely popular at the outset and mostly conducted in full view of the American public.
  • ...14 more annotations...
  • The problem was that, once the initial euphoria wore off, the public wasn’t much interested in it. Coverage in print media remained solid, but cable-news coverage of Afghanistan dropped off quickly, especially once
  • “America’s not at war” was a common refrain among the troops. “We’re at war. America’s at the mall.”
  • Now those same Americans have the full withdrawal from Afghanistan they apparently want: Some 70 percent of the public supports a pullout.
  • Not that they care that intensely about it; as the foreign-policy scholar Stephen Biddle recently observed, the war is practically an afterthought in U.S. politics. “You would need an electron microscope to detect the effect of Afghanistan on any congressional race in the last decade,” Biddle said early this year. “It’s been invisible.”
  • What the public does care about, however, is using Afghanistan as raw material for cheap patriotism and partisan attacks (some right and some wrong, but few of them in good faith) on every president since 2001.
  • Nor did they want to think about whether "draining the swamp" and modernizing and developing Afghanistan (which would mean a lot more than a few elections) was worth the cost and effort.
  • Maybe it would have been worth it. Or maybe such a project was impossible. We’ll never know for certain, because American political and military leaders only tried pieces of several strategies, never a coherent whole, mostly to keep the costs and casualties down and to keep the war off the front pages and away from a public that didn’t want to hear about it
  • Nor did Americans ever consider whether or when Afghanistan, as a source of terrorist threats to the U.S., had been effectively neutralized. Nothing is perfect, and risks are never zero. But there was no time at which we all decided that “close enough” was good enough, and that we’d rather come home than stay.
  • Biden’s policy, of course, is not that different from Trump’s, despite all the partisan howling about it from Republicans. As my colleague David Frum has put it: “For good or ill, the Biden policy on Afghanistan is the same as the Trump policy, only with less lying.”
  • But as comforting as it would be to blame Obama and Trump, we must look inward and admit that we told our elected leaders—of both parties—that they were facing a no-win political test. If they chose to leave, they would be cowards who abandoned Afghanistan. If they chose to stay, they were warmongers intent on pursuing “forever war.”
  • A serious people—the kind of people we once were—would have made serious choices, long before this current debacle was upon them. They would today be trying to learn something from nearly 2,500 dead service members and many more wounded
  • Biden was right, in the end, to bite the bullet and refuse to pass this conflict on to yet another president
  • His execution of this resolve, however, looks to be a tragic and shameful mess and will likely be a case study in policy schools for years to come. But there was no version of “Stop the forever war” that didn’t end with the fall of Kabul
  • Before we move on, before we head back to the mall, before we resume posting memes, and before we return to bickering with each other about whether we should have to mask up at Starbucks, let us remember that this day came about for one reason, and one reason only. Because it is what we wanted.
Javier E

How Donald Trump Could Build an Autocracy in the U.S. - The Atlantic - 0 views

  • Everything imagined above—and everything described below—is possible only if many people other than Donald Trump agree to permit it. It can all be stopped, if individual citizens and public officials make the right choices. The story told here, like that told by Charles Dickens’s Ghost of Christmas Yet to Come, is a story not of things that will be, but of things that may be. Other paths remain open. It is up to Americans to decide which one the country will follow.
  • What is spreading today is repressive kleptocracy, led by rulers motivated by greed rather than by the deranged idealism of Hitler or Stalin or Mao. Such rulers rely less on terror and more on rule-twisting, the manipulation of information, and the co-optation of elites.
  • the American system is also perforated by vulnerabilities no less dangerous for being so familiar. Supreme among those vulnerabilities is reliance on the personal qualities of the man or woman who wields the awesome powers of the presidency.
  • ...51 more annotations...
  • The president of the United States, on the other hand, is restrained first and foremost by his own ethics and public spirit. What happens if somebody comes to the high office lacking those qualities?
  • Donald Trump, however, represents something much more radical. A president who plausibly owes his office at least in part to a clandestine intervention by a hostile foreign intelligence service? Who uses the bully pulpit to target individual critics? Who creates blind trusts that are not blind, invites his children to commingle private and public business, and somehow gets the unhappy members of his own political party either to endorse his choices or shrug them off? If this were happening in Honduras, we’d know what to call it. It’s happening here instead, and so we are baffled.
  • As politics has become polarized, Congress has increasingly become a check only on presidents of the opposite party. Recent presidents enjoying a same-party majority in Congress—Barack Obama in 2009 and 2010, George W. Bush from 2003 through 2006—usually got their way.
  • Trump has scant interest in congressional Republicans’ ideas, does not share their ideology, and cares little for their fate. He can—and would—break faith with them in an instant to further his own interests. Yet here they are, on the verge of achieving everything they have hoped to achieve for years, if not decades. They owe this chance solely to Trump’s ability to deliver a crucial margin of votes in a handful of states—Wisconsin, Michigan, and Pennsylvania—which has provided a party that cannot win the national popular vote a fleeting opportunity to act as a decisive national majority.
  • What excites Trump is his approval rating, his wealth, his power. The day could come when those ends would be better served by jettisoning the institutional Republican Party in favor of an ad hoc populist coalition, joining nationalism to generous social spending—a mix that’s worked well for authoritarians in places like Poland.
  • A scandal involving the president could likewise wreck everything that Republican congressional leaders have waited years to accomplish. However deftly they manage everything else, they cannot prevent such a scandal. But there is one thing they can do: their utmost not to find out about it.
  • Ryan has learned his prudence the hard way. Following the airing of Trump’s past comments, caught on tape, about his forceful sexual advances on women, Ryan said he’d no longer campaign for Trump. Ryan’s net favorability rating among Republicans dropped by 28 points in less than 10 days. Once unassailable in the party, he suddenly found himself disliked by 45 percent of Republicans.
  • Ambition will counteract ambition only until ambition discovers that conformity serves its goals better. At that time, Congress, the body expected to check presidential power, may become the president’s most potent enabler.
  • Discipline within the congressional ranks will be strictly enforced not only by the party leadership and party donors, but also by the overwhelming influence of Fox News.
  • Fox learned its lesson: Trump sells; critical coverage does not. Since the election, the network has awarded Kelly’s former 9 p.m. time slot to Tucker Carlson, who is positioning himself as a Trump enthusiast in the Hannity mold.
  • Gingrich said: The president “has, frankly, the power of the pardon. It is a totally open power, and he could simply say, ‘Look, I want them to be my advisers. I pardon them if anybody finds them to have behaved against the rules. Period.’ And technically, under the Constitution, he has that level of authority.”
  • In 2009, in the run-up to the Tea Party insurgency, South Carolina’s Bob Inglis crossed Fox, criticizing Glenn Beck and telling people at a town-hall meeting that they should turn his show off. He was drowned out by booing, and the following year, he lost his primary with only 29 percent of the vote, a crushing repudiation for an incumbent untouched by any scandal.
  • Fox is reinforced by a carrier fleet of supplementary institutions: super PACs, think tanks, and conservative web and social-media presences, which now include such former pariahs as Breitbart and Alex Jones. So long as the carrier fleet coheres—and unless public opinion turns sharply against the president—oversight of Trump by the Republican congressional majority will very likely be cautious, conditional, and limited.
  • His immediate priority seems likely to be to use the presidency to enrich himself. But as he does so, he will need to protect himself from legal risk. Being Trump, he will also inevitably wish to inflict payback on his critics. Construction of an apparatus of impunity and revenge will begin haphazardly and opportunistically. But it will accelerate. It will have to.
  • By filling the media space with bizarre inventions and brazen denials, purveyors of fake news hope to mobilize potential supporters with righteous wrath—and to demoralize potential opponents by nurturing the idea that everybody lies and nothing matters
  • The United States may be a nation of laws, but the proper functioning of the law depends upon the competence and integrity of those charged with executing it. A president determined to thwart the law in order to protect himself and those in his circle has many means to do so.
  • The powers of appointment and removal are another. The president appoints and can remove the commissioner of the IRS. He appoints and can remove the inspectors general who oversee the internal workings of the Cabinet departments and major agencies. He appoints and can remove the 93 U.S. attorneys, who have the power to initiate and to end federal prosecutions. He appoints and can remove the attorney general, the deputy attorney general, and the head of the criminal division at the Department of Justice.
  • Republicans in Congress have long advocated reforms to expedite the firing of underperforming civil servants. In the abstract, there’s much to recommend this idea. If reform is dramatic and happens in the next two years, however, the balance of power between the political and the professional elements of the federal government will shift, decisively, at precisely the moment when the political elements are most aggressive. The intelligence agencies in particular would likely find themselves exposed to retribution from a president enraged at them for reporting on Russia’s aid to his election campaign.
  • The McDonnells had been convicted on a combined 20 counts.
  • The Supreme Court objected, however, that the lower courts had interpreted federal anticorruption law too broadly. The relevant statute applied only to “official acts.” The Court defined such acts very strictly, and held that “setting up a meeting, talking to another official, or organizing an event—without more—does not fit that definition of an ‘official act.’ ”
  • Trump is poised to mingle business and government with an audacity and on a scale more reminiscent of a leader in a post-Soviet republic than anything ever before seen in the United States.
  • Trump will try hard during his presidency to create an atmosphere of personal munificence, in which graft does not matter, because rules and institutions do not matter. He will want to associate economic benefit with personal favor. He will create personal constituencies, and implicate other people in his corruption.
  • You would never know from Trump’s words that the average number of felonious killings of police during the Obama administration’s tenure was almost one-third lower than it was in the early 1990s, a decline that tracked with the general fall in violent crime that has so blessed American society. There had been a rise in killings of police in 2014 and 2015 from the all-time low in 2013—but only back to the 2012 level. Not every year will be the best on record.
  • A mistaken belief that crime is spiraling out of control—that terrorists roam at large in America and that police are regularly gunned down—represents a considerable political asset for Donald Trump. Seventy-eight percent of Trump voters believed that crime had worsened during the Obama years.
  • From the point of view of the typical Republican member of Congress, Fox remains all-powerful: the single most important source of visibility and affirmation with the voters whom a Republican politician cares about
  • Civil unrest will not be a problem for the Trump presidency. It will be a resource. Trump will likely want not to repress it, but to publicize it—and the conservative entertainment-outrage complex will eagerly assist him
  • Immigration protesters marching with Mexican flags; Black Lives Matter demonstrators bearing antipolice slogans—these are the images of the opposition that Trump will wish his supporters to see. The more offensively the protesters behave, the more pleased Trump will be.
  • If there is harsh law enforcement by the Trump administration, it will benefit the president not to the extent that it quashes unrest, but to the extent that it enflames more of it, ratifying the apocalyptic vision that haunted his speech at the convention.
  • In the early days of the Trump transition, Nic Dawes, a journalist who has worked in South Africa, delivered an ominous warning to the American media about what to expect. “Get used to being stigmatized as ‘opposition,’ ” he wrote. “The basic idea is simple: to delegitimize accountability journalism by framing it as partisan.”
  • Mostly, however, modern strongmen seek merely to discredit journalism as an institution, by denying that such a thing as independent judgment can exist. All reporting serves an agenda. There is no truth, only competing attempts to grab power.
  • In true police states, surveillance and repression sustain the power of the authorities. But that’s not how power is gained and sustained in backsliding democracies. Polarization, not persecution, enables the modern illiberal regime.
  • A would-be kleptocrat is actually better served by spreading cynicism than by deceiving followers with false beliefs: Believers can be disillusioned; people who expect to hear only lies can hardly complain when a lie is exposed.
  • The inculcation of cynicism breaks down the distinction between those forms of media that try their imperfect best to report the truth, and those that purvey falsehoods for reasons of profit or ideology. The New York Times becomes the equivalent of Russia’s RT; The Washington Post of Breitbart; NPR of Infowars.
  • Trump had not a smidgen of evidence beyond his own bruised feelings and internet flotsam from flagrantly unreliable sources. Yet once the president-elect lent his prestige to the crazy claim, it became fact for many people. A survey by YouGov found that by December 1, 43 percent of Republicans accepted the claim that millions of people had voted illegally in 2016.
  • A clear untruth had suddenly become a contested possibility. When CNN’s Jeff Zeleny correctly reported on November 28 that Trump’s tweet was baseless, Fox’s Sean Hannity accused Zeleny of media bias—and then proceeded to urge the incoming Trump administration to take a new tack with the White House press corps, and to punish reporters like Zeleny.
  • The whipping-up of potentially violent Twitter mobs against media critics is already a standard method of Trump's governance.
  • I’ve talked with well-funded Trump supporters who speak of recruiting a troll army explicitly modeled on those used by Turkey’s Recep Tayyip Erdoğan and Russia’s Putin to take control of the social-media space, intimidating some critics and overwhelming others through a blizzard of doubt-casting and misinformation.
  • He and his team are serving notice that a new era in government-media relations is coming, an era in which all criticism is by definition oppositional—and all critics are to be treated as enemies.
  • “Lying is the message,” she wrote. “It’s not just that both Putin and Trump lie, it is that they lie in the same way and for the same purpose: blatantly, to assert power over truth itself.”
  • The lurid mass movements of the 20th century—communist, fascist, and other—have bequeathed to our imaginations an outdated image of what 21st-century authoritarianism might look like.
  • In a society where few people walk to work, why mobilize young men in matching shirts to command the streets? If you’re seeking to domineer and bully, you want your storm troopers to go online, where the more important traffic is. Demagogues need no longer stand erect for hours orating into a radio microphone. Tweet lies from a smartphone instead.
  • “Populist-fueled democratic backsliding is difficult to counter,” wrote the political scientists Andrea Kendall-Taylor and Erica Frantz late last year. “Because it is subtle and incremental, there is no single moment that triggers widespread resistance or creates a focal point around which an opposition can coalesce … Piecemeal democratic erosion, therefore, typically provokes only fragmented resistance.”
  • If people retreat into private life, if critics grow quieter, if cynicism becomes endemic, the corruption will slowly become more brazen, the intimidation of opponents stronger. Laws intended to ensure accountability or prevent graft or protect civil liberties will be weakened.
  • If the president uses his office to grab billions for himself and his family, his supporters will feel empowered to take millions. If he successfully exerts power to punish enemies, his successors will emulate his methods.
  • If citizens learn that success in business or in public service depends on the favor of the president and his ruling clique, then it's not only American politics that will change. The economy will be corrupted too, and with it the larger culture.
  • A culture that has accepted that graft is the norm, that rules don’t matter as much as relationships with those in power, and that people can be punished for speech and acts that remain theoretically legal—such a culture is not easily reoriented back to constitutionalism, freedom, and public integrity.
  • The oft-debated question “Is Donald Trump a fascist?” is not easy to answer. There are certainly fascistic elements to him: the subdivision of society into categories of friend and foe; the boastful virility and the delight in violence; the vision of life as a struggle for dominance that only some can win, and that others must lose.
  • He is so pathetically needy, so shamelessly self-interested, so fitful and distracted. Fascism fetishizes hardihood, sacrifice, and struggle—concepts not often associated with Trump.
  • Perhaps the better question about Trump is not “What is he?” but “What will he do to us?”
  • By all early indications, the Trump presidency will corrode public integrity and the rule of law—and also do untold damage to American global leadership, the Western alliance, and democratic norms around the world
  • The damage has already begun, and it will not be soon or easily undone. Yet exactly how much damage is allowed to be done is an open question—the most important near-term question in American politics. It is also an intensely personal one, for its answer will be determined by the answer to another question: What will you do?
Javier E

Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Stree... - 2 views

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • Most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • ...297 more annotations...
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. Zdeněk Neubauer
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • stories; Adam Smith believed. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • Contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high-GDP growth in low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables,
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: A field that believes in the invisible hand of the market wants to be without mysteries.
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field's inquiry?
  • we seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are "behind the scenes," ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of.
  • I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that his most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization, or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, are basic human metacharacteristics, which appear as early as the oldest myths and stories.
  • The Hebrews, with linear time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today's progression of progress, and growth for growth's sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics
  • From where did economics get the concept of numbers as the very foundation of the world?
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • The idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of "economic man."
  • The History of Animal Spirits: Dreams Never Sleep
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
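Read literally, this definition is compact enough to write down. The sketch below is our illustration only (the two goods, prices, and utility function are hypothetical stand-ins): an agent with a budget evaluates every feasible bundle and picks whatever maximizes his own benefit, with no regard for anything else.

    # A minimal sketch of homo economicus as defined above: a rational agent
    # who, given prices and a budget, picks the bundle that maximizes his own
    # utility. Goods, prices, and the utility function are hypothetical.
    from itertools import product

    def utility(bread, wine):
        return bread ** 0.5 * wine ** 0.5  # any increasing function would do

    def homo_economicus(budget, p_bread, p_wine):
        best, best_u = None, float("-inf")
        for bread, wine in product(range(51), repeat=2):
            if bread * p_bread + wine * p_wine <= budget:  # the only constraint
                u = utility(bread, wine)
                if u > best_u:  # narrowly egotistical maximization, nothing more
                    best, best_u = (bread, wine), u
        return best

    print(homo_economicus(budget=100, p_bread=2, p_wine=5))  # -> (25, 10)

Everything the book goes on to question (friendship, morals, rest) is simply absent from this agent’s objective.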
  • The Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • In the epic, there is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter even a mention of the threat of violence.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—the direct gaining of construction materials in the case of felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • The epic is a story of nature and civilization, of heroism, defiance, and the battle against the gods, and of evil; an epic about wisdom, immortality, and also futility.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • But it is in friendship where—often incidentally, as a by-product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual alone does not have the courage to do so.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level.
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, is made from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship.
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • This started with the Babylonians—rural nature becomes just a supplier of raw materials, of resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and in which they should reside, but becomes a mere reservoir for natural (re)sources.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive.
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or to be both of these two.
  • In this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things he needs should be decreased by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. With people, on the other hand, it appears that the more a person has and the more developed and richer he is, the greater the number of his needs (including the unsaturated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” What is more, evil resides in nature: Humbaba lives in the cedar forest, which is also the reason to eradicate it completely.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, and evil, and the good (the humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered to reside in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • By this time, Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have as idyllic a relationship to nature as the Old Testament prophets had.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process we can perceive, with a bit of imagination, as the first seed of the principle of the market’s invisible hand, and therefore a parallel with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil.
  • Enkidu devastated the doings of the city (the external, outside-the-walls world). But he was later harnessed and fought at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif reappears thousands of years later in what is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (the reforming of something animally wild and uncultivated into a civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual no longer tries to maximize his goods or profits; what matters instead is writing his name into human memory in the form of heroic acts or deeds.
  • It is an immortality connected with letters and the cult of the word: “A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As a tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and, at least toward the end of his life, maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism in his search for immortal life.
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility-maximization role that mainstream economics has tirelessly tried to pin on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization;
  • We today remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • On the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • But is there something from it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento regarding our restlessness, our inherited dissatisfaction, and the volatility connected to it. Considering that they have lasted five thousand years and that to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to move from naïveté to awakening) requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: Camaraderie. For great acts, however, great love is necessary, real love: Friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death.
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later shatters this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary.
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil by nature, that man is dog-eat-dog (an animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, by their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…); they functioned as bankers and participated in issues of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…)
  • There are no echoes of asceticism, nor of the cleansing and spiritual effect of poverty. It is fitting, therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac, and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it for granted. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • Over time, however, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Care for the soul has today been replaced by care for external things.
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • Economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth, at a given place in Mesopotamia20 and at a given time.
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, a discipline of utterly earthly making.
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes in these basic characteristics: Beauty (a perfect face, on which it is “pleasant to look upon”; “beauty,” expressed in the Egyptian word nefer, not only means aesthetics but contains moral qualities as well),
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • Manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies, because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • The Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • The Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer,35 exemplified by Job.
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • The Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods.
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • In an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • RULERS ARE MERE MEN In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • The needs of future generations will have to be considered; after all humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way the world is put together can be at least partially46 deciphered by any other wise and reasonable being who honors rational rules.
  • This, for the methodology of science and economics, is very important, because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION The created world has an order of sorts, an order recognizable by us as people.
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • I was there when he set the heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • Examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • “Now then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.”
  • The rational examination of nature has its roots, surprisingly, in religion.
  • “The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place (…)”
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, final stroke of the brush of creation, naming of the animals—this act is given to a human, it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein puts this tersely in his Tractatus: the limits of our language are the limits of our world.53
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly: he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act of real-ization.
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated; man himself participates in the creation—a creation that is constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • In this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.”
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • In the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • Some see the cause of the cycle in the discrepancy between savings and investment, and others are convinced of its monetary essence (…)
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • (…) or even sunspots. The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves. Pharaoh’s Dream: Joseph and the First Business Cycle
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy Here, let’s make several observations on this: Through taxation74 at the level of one-fifth of the crop75 in good years, the crop was saved and granaries were then opened in bad years; the prophecy was thus de facto prevented (prosperous years were limited and hunger averted, through a predecessor of fiscal stabilization).
  • The Old Testament prophesies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
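The mechanics of the policy can be replayed in a toy simulation. The harvest figures below are invented for illustration; only the one-fifth levy comes from the text. Taxing the fat years fills the granary, and opening it in the lean years halves the swing between abundance and famine.

    # Joseph's granary as a predecessor of fiscal stabilization: a one-fifth
    # levy in seven fat years, released evenly over seven lean years.
    # Harvest numbers are purely illustrative.
    fat_years, lean_years = [1000] * 7, [200] * 7

    granary, consumption = 0, []
    for harvest in fat_years:
        levy = harvest // 5          # "one-fifth of the crop" is stored
        granary += levy
        consumption.append(harvest - levy)
    for harvest in lean_years:
        release = min(granary, 200)  # granaries opened in the bad years
        granary -= release
        consumption.append(harvest + release)

    print(consumption)  # seven years of 800, then seven years of 400

Without the levy, consumption would have swung between 1000 and 200; with it, famine is averted: the self-contradicting prophecy at work.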
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • Back to the Torah: Later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust, in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and of examining their influence on the economy.
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that, contrarily, support the public welfare.
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we consolidate these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • It is not within the scope of this book to answer that question; justice has been done to the question if it manages to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF? This is probably the most difficult moral problem we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a “moral” act on the basis of economic calculus (therefore we carry out a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity. I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • Morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate with each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • The Hebrews offered an interesting compromise between the teachings of the Stoics and Epicureans. We will go into it in detail later, so only briefly here.
  • It calls for bounded optimization (optimization within limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and the maintenance of rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
  • The mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trump cards of the Epicurean school: they did not need exogenously given norms, and argued that they could “calculate” ethics, what to do, for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was to defend where the exogenously given rules came from and whether they are universal) and take an indifferent stand toward the results of their actions.
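The three positions can be contrasted in one toy comparison. This is our schematic, not the book’s formalism; the action set, the utilities, and the permitted set are hypothetical stand-ins.

    # Epicurean: utility first, no exogenous rules. Stoic: rules only,
    # indifferent to utility. Hebrew compromise: bounded optimization --
    # maximize utility, but only within non-negotiable, exogenously given rules.
    actions = ["work", "feast", "cheat", "rest"]
    utility = {"work": 3, "feast": 5, "cheat": 8, "rest": 2}
    permitted = {"work", "feast", "rest"}  # the exogenously given rules

    epicurean = max(actions, key=utility.get)   # -> "cheat"
    stoic = min(permitted)  # any permitted act, chosen without regard to utility
    hebrew = max(permitted, key=utility.get)    # -> "feast"

    print(epicurean, stoic, hebrew)

The Hebrew agent still optimizes, but the rule set itself is never up for optimization.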
  • To Love the Law The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not merely out of duty. By no means was this to be done on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when it does not (calculated from the probability of being caught and the amount of punishment vis-à-vis the possible gain).
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • The principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY Owing to the Hebrews’ liberation from Egyptian slavery, freedom and responsibility became the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values.
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command.
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material standard, it begins to own us.
  • This way of life understandably had immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things tied them to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • A system of social regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility societywide appear for the first time, as embodied in the Talmudic principle of Kofin al midat S´dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be “sold,” but it was not sale; it was rent. The price (rent) of real estate depended on how long remained until a forgiveness year. It was about the awareness that we may work the land, but in the last instance we are merely “aliens and strangers,” who have the land only rented to us for a fixed time. All land and riches came from the Lord.
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
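The jubilee pricing rule described above is simple arithmetic. A minimal sketch, with a hypothetical annual yield and the fifty-year cycle: the buyer is effectively paying for the harvests remaining until the next forgiveness year, so the nearer the jubilee, the lower the price.

    # Land is effectively leased until the next jubilee; its price is the
    # value of the remaining harvests. The annual yield is illustrative.
    JUBILEE_CYCLE = 50

    def land_price(annual_yield, years_since_jubilee):
        years_remaining = JUBILEE_CYCLE - years_since_jubilee
        return annual_yield * years_remaining  # more harvests left -> higher price

    print(land_price(annual_yield=10, years_since_jubilee=10))  # -> 400
    print(land_price(annual_yield=10, years_since_jubilee=45))  # -> 50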
  • Glean Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net Every Israelite also had the responsibility of setting aside a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of putting them into practice. Their fulfillment, where it can be done, takes place gradually, “in layers.” Charitable activities are classified in the Talmud according to several target groups with various priorities, classified, it could be said, according to rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE If it appears to us that today’s era is based on money and debt, and our time will be written into history as the “Debt age,” then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
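The idea that a recorded debt can itself circulate is easy to make concrete. A minimal sketch (the names and the data structure are our illustration, not a historical reconstruction): the claim written on the tablet changes hands, and the debt thereby serves as a means of payment.

    # Debt as currency: an IOU is transferable, so the claim itself circulates.
    from dataclasses import dataclass

    @dataclass
    class IOU:
        debtor: str   # who owes the grain
        holder: str   # who currently holds the claim
        amount: int   # e.g., measures of barley

        def transfer(self, new_holder: str):
            self.holder = new_holder  # paying someone by handing over the claim

    tablet = IOU(debtor="farmer", holder="temple", amount=30)
    tablet.transfer("merchant")  # the temple pays the merchant with the debt
    print(tablet)  # IOU(debtor='farmer', holder='merchant', amount=30)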
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • There were also clearly defined rules on how far one could go in demanding guarantees and on the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood.
  • In the end, the term “bank” comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a means of exchange, a holder of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt has gone so far and reached such a dominant position in society that operating with debts (fiscal policy) or interest or money supply (monetary policy) means that these can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225–1274), thought along similar lines; in his time, the strict ban on lending at usurious interest was loosened, possibly due to him.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • Money is something like energy that can travel through time. And it is a very useful energy, but at the same time a very dangerous one as well.
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
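In the plainest arithmetic, this time travel is just compounding and discounting at some interest rate r (the figures below are purely illustrative): debt pulls future income into the present, and saving pushes present income into the future.

    # Money traveling through time: discounting brings the future to the
    # present (debt); compounding carries the present into the future (saving).
    r = 0.05  # hypothetical yearly interest rate

    def present_value(future_amount, years):
        return future_amount / (1 + r) ** years   # the future, spent now

    def future_value(present_amount, years):
        return present_amount * (1 + r) ** years  # the past, enjoyed later

    print(round(present_value(100, 10), 2))  # -> 61.39
    print(round(future_value(100, 10), 2))   # -> 162.89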
  • Labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that originally belonged among man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain energy from it).
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.”170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • A good example is the commandment on the Sabbath. No one at all could work on this day, not even those who were subordinate to an observant Jew.
  • The message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • We have days when we must not toil, connected (at least lexically) with the word meaning emptiness: the English term “vacation” (an emptying), as with the French term les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • Six-sevenths of time either be dissatisfied and reshape the world into your own image, man, but one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength; He rested because He was done. He was done with His work, so that He could enjoy it and take delight in His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of ascetic perception of the world, respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility or disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time give our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • We will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tried to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. Everyone knows from their own experience that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8
Javier E

Whistleblower: Twitter misled investors, FTC and underplayed spam issues - Washington Post - 0 views

  • Twitter executives deceived federal regulators and the company’s own board of directors about “extreme, egregious deficiencies” in its defenses against hackers, as well as its meager efforts to fight spam, according to an explosive whistleblower complaint from its former security chief.
  • The complaint from former head of security Peiter Zatko, a widely admired hacker known as “Mudge,” depicts Twitter as a chaotic and rudderless company beset by infighting, unable to properly protect its 238 million daily users, including government agencies, heads of state and other influential public figures.
  • Among the most serious accusations in the complaint, a copy of which was obtained by The Washington Post, is that Twitter violated the terms of an 11-year-old settlement with the Federal Trade Commission by falsely claiming that it had a solid security plan. Zatko’s complaint alleges he had warned colleagues that half the company’s servers were running out-of-date and vulnerable software and that executives withheld dire facts about the number of breaches and lack of protection for user data, instead presenting directors with rosy charts measuring unimportant changes.
  • “Security and privacy have long been top companywide priorities at Twitter,” said Twitter spokeswoman Rebecca Hahn. She said that Zatko’s allegations appeared to be “riddled with inaccuracies” and that Zatko “now appears to be opportunistically seeking to inflict harm on Twitter, its customers, and its shareholders.” Hahn said that Twitter fired Zatko after 15 months “for poor performance and leadership.” Attorneys for Zatko confirmed he was fired but denied it was for performance or leadership.
  • the whistleblower document alleges the company prioritized user growth over reducing spam, though unwanted content made the user experience worse. Executives stood to win individual bonuses of as much as $10 million tied to increases in daily users, the complaint asserts, and nothing explicitly for cutting spam.
  • Chief executive Parag Agrawal was “lying” when he tweeted in May that the company was “strongly incentivized to detect and remove as much spam as we possibly can,” the complaint alleges.
  • Zatko described his decision to go public as an extension of his previous work exposing flaws in specific pieces of software and broader systemic failings in cybersecurity. He was hired at Twitter by former CEO Jack Dorsey in late 2020 after a major hack of the company’s systems.
  • “I felt ethically bound. This is not a light step to take,” said Zatko, who was fired by Agrawal in January. He declined to discuss what happened at Twitter, except to stand by the formal complaint. Under SEC whistleblower rules, he is entitled to legal protection against retaliation, as well as potential monetary rewards.
  • A person familiar with Zatko’s tenure said the company investigated Zatko’s security claims during his time there and concluded they were sensationalistic and without merit. Four people familiar with Twitter’s efforts to fight spam said the company deploys extensive manual and automated tools to both measure the extent of spam across the service and reduce it.
  • In 1998, Zatko had testified to Congress that the internet was so fragile that he and others could take it down with a half-hour of concentrated effort. He later served as the head of cyber grants at the Defense Advanced Research Projects Agency, the Pentagon innovation unit that had backed the internet’s invention.
  • Overall, Zatko wrote in a February analysis for the company attached as an exhibit to the SEC complaint, “Twitter is grossly negligent in several areas of information security. If these problems are not corrected, regulators, media and users of the platform will be shocked when they inevitably learn about Twitter’s severe lack of security basics.”
  • Zatko’s complaint says strong security should have been much more important to Twitter, which holds vast amounts of sensitive personal data about users. Twitter has the email addresses and phone numbers of many public figures, as well as dissidents who communicate over the service at great personal risk.
  • This month, an ex-Twitter employee was convicted of using his position at the company to spy on Saudi dissidents and government critics, passing their information to a close aide of Crown Prince Mohammed bin Salman in exchange for cash and gifts.
  • Zatko’s complaint says he believed the Indian government had forced Twitter to put one of its agents on the payroll, with access to user data at a time of intense protests in the country. The complaint said supporting information for that claim has gone to the National Security Division of the Justice Department and the Senate Select Committee on Intelligence. Another person familiar with the matter agreed that the employee was probably an agent.
  • “Take a tech platform that collects massive amounts of user data, combine it with what appears to be an incredibly weak security infrastructure and infuse it with foreign state actors with an agenda, and you’ve got a recipe for disaster,” Charles E. Grassley (R-Iowa), the top Republican on the Senate Judiciary Committee,
  • Many government leaders and other trusted voices use Twitter to spread important messages quickly, so a hijacked account could drive panic or violence. In 2013, a captured Associated Press handle falsely tweeted about explosions at the White House, sending the Dow Jones industrial average briefly plunging more than 140 points.
  • After a teenager managed to hijack the verified accounts of Obama, then-candidate Joe Biden, Musk and others in 2020, Twitter’s chief executive at the time, Jack Dorsey, asked Zatko to join him, saying that he could help the world by fixing Twitter’s security and improving the public conversation, Zatko asserts in the complaint.
  • The complaint — filed last month with the Securities and Exchange Commission and the Department of Justice, as well as the FTC — says thousands of employees still had wide-ranging and poorly tracked internal access to core company software, a situation that for years had led to embarrassing hacks, including the commandeering of accounts held by such high-profile users as Elon Musk and former presidents Barack Obama and Donald Trump.
  • But at Twitter Zatko encountered problems more widespread than he realized and leadership that didn’t act on his concerns, according to the complaint.
  • Twitter’s difficulties with weak security stretch back more than a decade before Zatko’s arrival at the company in November 2020. In a pair of 2009 incidents, hackers gained administrative control of the social network, allowing them to reset passwords and access user data. In the first, beginning around January of that year, hackers sent tweets from the accounts of high-profile users, including Fox News and Obama.
  • Several months later, a hacker was able to guess an employee’s administrative password after gaining access to similar passwords in their personal email account. That hacker was able to reset at least one user’s password and obtain private information about any Twitter user.
  • Twitter continued to suffer high-profile hacks and security violations, including in 2017, when a contract worker briefly took over Trump’s account, and in the 2020 hack, in which a Florida teen tricked Twitter employees and won access to verified accounts. Twitter then said it put additional safeguards in place.
  • This year, the Justice Department accused Twitter of asking users for their phone numbers in the name of increased security, then using the numbers for marketing. Twitter agreed to pay a $150 million fine for allegedly breaking the 2011 order, which barred the company from making misrepresentations about the security of personal data.
  • After Zatko joined the company, he found it had made little progress since the 2011 settlement, the complaint says. The complaint alleges that he was able to reduce the backlog of safety cases, including harassment and threats, from 1 million to 200,000, add staff and push to measure results.
  • But Zatko saw major gaps in what the company was doing to satisfy its obligations to the FTC, according to the complaint. In Zatko’s interpretation, according to the complaint, the 2011 order required Twitter to implement a Software Development Life Cycle program, a standard process for making sure new code is free of dangerous bugs. The complaint alleges that other employees had been telling the board and the FTC that they were making progress in rolling out that program to Twitter’s systems. But Zatko alleges that he discovered that it had been sent to only a tenth of the company’s projects, and even then treated as optional.
  • “If all of that is true, I don’t think there’s any doubt that there are order violations,” Vladeck, who is now a Georgetown Law professor, said in an interview. “It is possible that the kinds of problems that Twitter faced eleven years ago are still running through the company.”
  • “Agrawal’s Tweets and Twitter’s previous blog posts misleadingly imply that Twitter employs proactive, sophisticated systems to measure and block spam bots,” the complaint says. “The reality: mostly outdated, unmonitored, simple scripts plus overworked, inefficient, understaffed, and reactive human teams.”
  • One current and one former employee recalled that incident, when failures at two Twitter data centers drove concerns that the service could have collapsed for an extended period. “I wondered if the company would exist in a few days,” one of them said.
  • The current and former employees also agreed with the complaint’s assertion that past reports to various privacy regulators were “misleading at best.”
  • For example, they said the company implied that it had destroyed all data on users who asked, but the material had spread so widely inside Twitter’s networks, it was impossible to know for sure
  • As the head of security, Zatko says he also was in charge of a division that investigated users’ complaints about accounts, which meant that he oversaw the removal of some bots, according to the complaint. Spam bots — computer programs that tweet automatically — have long vexed Twitter. Unlike its social media counterparts, Twitter allows users to program bots to be used on its service: For example, the Twitter account @big_ben_clock is programmed to tweet “Bong Bong Bong” every hour in time with Big Ben in London. Twitter also allows people to create accounts without using their real identities, making it harder for the company to distinguish between authentic, duplicate and automated accounts.
  • In the complaint, Zatko alleges he could not get a straight answer when he sought what he viewed as an important data point: the prevalence of spam and bots across all of Twitter, not just among monetizable users.
  • Zatko cites a “sensitive source” who said Twitter was afraid to determine that number because it “would harm the image and valuation of the company.” He says the company’s tools for detecting spam are far less robust than implied in various statements.
  • The complaint also alleges that Zatko warned the board early in his tenure that overlapping outages in the company’s data centers could leave it unable to correctly restart its servers. That could have left the service down for months, or even have caused all of its data to be lost. That came close to happening in 2021, when an “impending catastrophic” crisis threatened the platform’s survival before engineers were able to save the day, the complaint says, without providing further details.
  • The four people familiar with Twitter’s spam and bot efforts said the engineering and integrity teams run software that samples thousands of tweets per day, and 100 accounts are sampled manually.
  • Some employees charged with executing the fight agreed that they had been short of staff. One said top executives showed “apathy” toward the issue.
  • Zatko’s complaint likewise depicts leadership dysfunction, starting with the CEO. Dorsey was largely absent during the pandemic, which made it hard for Zatko to get rulings on who should be in charge of what in areas of overlap and easier for rival executives to avoid collaborating, three current and former employees said.
  • For example, Zatko would encounter disinformation as part of his mandate to handle complaints, according to the complaint. To that end, he commissioned an outside report that found one of the disinformation teams had unfilled positions, yawning language deficiencies, and a lack of technical tools or the engineers to craft them. The authors said Twitter had no effective means of dealing with consistent spreaders of falsehoods.
  • Dorsey made little effort to integrate Zatko at the company, according to the three employees as well as two others familiar with the process who spoke on the condition of anonymity to describe sensitive dynamics. In 12 months, Zatko could manage only six one-on-one calls, all less than 30 minutes, with his direct boss Dorsey, who also served as CEO of payments company Square, now known as Block, according to the complaint. Zatko allegedly did almost all of the talking, and Dorsey said perhaps 50 words in the entire year to him. “A couple dozen text messages” rounded out their electronic communication, the complaint alleges.
  • Faced with such inertia, Zatko asserts that he was unable to solve some of the most serious issues, according to the complaint.
  • Some 30 percent of company laptops blocked automatic software updates carrying security fixes, and thousands of laptops had complete copies of Twitter’s source code, making them a rich target for hackers, it alleges.
  • A successful hacker takeover of one of those machines would have been able to sabotage the product with relative ease, because the engineers pushed out changes without being forced to test them first in a simulated environment, current and former employees said.
  • “It’s near-incredible that for something of that scale there would not be a development test environment separate from production and there would not be a more controlled source-code management process,” said Tony Sager, former chief operating officer at the cyberdefense wing of the National Security Agency, the Information Assurance division.
  • Sager is currently senior vice president at the nonprofit Center for Internet Security, where he leads a consensus effort to establish best security practices.
  • The complaint says that about half of Twitter’s roughly 7,000 full-time employees had wide access to the company’s internal software and that access was not closely monitored, giving them the ability to tap into sensitive data and alter how the service worked. Three current and former employees agreed that these were issues.
  • “A best practice is that you should only be authorized to see and access what you need to do your job, and nothing else,” said former U.S. chief information security officer Gregory Touhill. “If half the company has access to and can make configuration changes to the production environment, that exposes the company and its customers to significant risk.”
  • The complaint says Dorsey never encouraged anyone to mislead the board about the shortcomings, but that others deliberately left out bad news.
  • When Dorsey left in November 2021, a difficult situation worsened under Agrawal, who had been responsible for security decisions as chief technology officer before Zatko’s hiring, the complaint says.
  • An unnamed executive had prepared a presentation for the new CEO’s first full board meeting, according to the complaint. Zatko’s complaint calls the presentation deeply misleading.
  • The presentation showed that 92 percent of employee computers had security software installed — without mentioning that those installations determined that a third of the machines were insecure, according to the complaint.
  • Another graphic implied a downward trend in the number of people with overly broad access, based on the small subset of people who had access to the highest administrative powers, known internally as “God mode.” That number was in the hundreds. But the number of people with broad access to core systems, which Zatko had called out as a big problem after joining, had actually grown slightly and remained in the thousands.
  • The presentation included only a subset of serious intrusions or other security incidents, from a total Zatko estimated as one per week, and it said that the uncontrolled internal access to core systems was responsible for just 7 percent of incidents, when Zatko calculated the real proportion as 60 percent.
  • Zatko stopped the material from being presented at the Dec. 9, 2021 meeting, the complaint said. But over his continued objections, Agrawal let it go to the board’s smaller Risk Committee a week later.
  • Agrawal didn’t respond to requests for comment. In an email to employees after publication of this article, obtained by The Post, he said that privacy and security continues to be a top priority for the company, and he added that the narrative is “riddled with inconsistencies” and “presented without important context.”
  • On Jan. 4, Zatko reported internally that the Risk Committee meeting might have been fraudulent, which triggered an Audit Committee investigation.
  • Agrawal fired him two weeks later. But Zatko complied with the company’s request to spell out his concerns in writing, even without access to his work email and documents, according to the complaint.
  • Since Zatko’s departure, Twitter has plunged further into chaos with Musk’s takeover, which the two parties agreed to in May. The stock price has fallen, many employees have quit, and Agrawal has dismissed executives and frozen big projects.
  • Zatko said he hoped that by bringing new scrutiny and accountability, he could improve the company from the outside.
  • “I still believe that this is a tremendous platform, and there is huge value and huge risk, and I hope that looking back at this, the world will be a better place, in part because of this.”
Javier E

Supreme Court Case on Public Sector Union Fees Rouses Political Suspicions - The New Yo... - 0 views

  • Taking a page from the liberal playbook, Mr. Horowitz and others recommended that conservative donors support groups similar in their ambition and structure to public interest organizations, like the American Civil Liberties Union and the NAACP Legal Defense Fund, that had enjoyed great success in the 1960s and 1970s by actively looking for clients with potentially precedent-setting cases, then pouring resources into litigating them.
  • The Center for Individual Rights became one of the earliest public interest groups to grow out of this reassessment, focusing initially on defending academic free speech amid what it considered to be overweening political correctness. It began to attack affirmative action policies a few years later.
  • That level of strategic savvy foreshadowed the more recent efforts of conservative political and policy groups.
  • “The A.C.L.U. in our view was a great organization for a long time — it defended individual rights across the board without regard to the content of the views expressed,” Mr. Pell said in an interview. “We thought that there was room for a public interest law firm to pick up the original A.C.L.U. mission and amplify it.”
  • The Center for Individual Rights is embedded in the world of prominent conservative political donors as well, having received large contributions from the Sarah Scaife Foundation, the John M. Olin Foundation, and the Lynde and Harry Bradley Foundation, according to filings with the Internal Revenue Service.
  • Many of the center’s donors contribute to other groups that have been active in trying to curtail union activity. The Bradley Foundation’s president, Michael Grebe, has been one of the most important supporters for Gov. Scott Walker of Wisconsin.
  • Mr. Grebe said his organization has had an interest in challenging public employee unions for about 15 years, supporting groups that do so on the policy level. He said support for legal action in the same area was “a natural extension” of these efforts.
  • it is difficult to find evidence of a single coordinating body that has directed money toward the Center for Individual Rights and its legal campaign to allow public employees to opt out of union fees.
  • Mr. Piereson, in an interview, acknowledged that there was both considerable suspicion among conservatives toward public employee unions and frequent communication among donors and employees of organizations on the right about efforts to rein in these unions. But, he said, this did not amount to a conspiracy. He called the coordination “diffuse, decentralized.”
Javier E

The Wages of Guilt: Memories of War in Germany and Japan (Ian Buruma) - 0 views

  • the main reason why Germans were more trusted by their neighbors was that they were learning, slowly and painfully, and not always fully, to trust themselves.
  • elders, in government and the mass media, still voice opinions about the Japanese war that are unsettling, to say the least. Conservative politicians still pay their annual respects at a shrine where war criminals are officially remembered. Justifications and denials of war crimes are still heard. Too many Japanese in conspicuous places, including the prime minister’s office itself, have clearly not “coped” with the war.
  • unlike Nazi Germany, Japan had no systematic program to destroy the life of every man, woman, and child of a people that, for ideological reasons, was deemed to have no right to exist.
  • “We never knew,” a common reaction in the 1950s, had worn shamefully thin in the eyes of a younger generation by the 1960s. The extraordinary criminality of a deliberate genocide was so obvious that it left no room for argument.
  • Right-wing nationalists like to cite the absence of a Japanese Holocaust as proof that Japanese have no reason to feel remorse about their war at all. It was, in their eyes, a war like any other; brutal, yes, just as wars fought by all great nations in history have been brutal. In fact, since the Pacific War was fought against Western imperialists, it was a justified—even noble—war of Asian liberation.
  • in the late 1940s or 1950s, a time when most Germans were still trying hard not to remember. It is in fact extraordinary how honestly Japanese novelists and filmmakers dealt with the horrors of militarism in those early postwar years. Such honesty is much less evident now.
  • Popular comic books, aimed at the young, extol the heroics of Japanese soldiers and kamikaze pilots, while the Chinese and their Western allies are depicted as treacherous and belligerent. In 2008, the chief of staff of the Japanese Air Self-Defense Force stated that Japan had been “tricked” into the war by China and the US. In 2013, Prime Minister Abe Shinzo publicly doubted whether Japan’s military aggression in China could even be called an invasion.
  • The fact is that Japan is still haunted by historical issues that should have been settled decades ago. The reasons are political rather than cultural, and have to do with the pacifist constitution—written by American jurists in 1946—and with the imperial institution, absolved of war guilt by General Douglas MacArthur after the war for the sake of expediency.
  • Japan, even under Allied occupation, continued to be governed by much the same bureaucratic and political elite, albeit under a new, more democratic constitution,
  • a number of conservatives felt humiliated by what they rightly saw as an infringement of their national sovereignty. Henceforth, to them, everything from the Allied Tokyo War Crimes Tribunal to the denunciations of Japan’s war record by left-wing teachers and intellectuals would be seen in this light.
  • The more “progressive” Japanese used the history of wartime atrocities as a warning against turning away from pacifism, the more defensive right-wing politicians and commentators became about the Japanese war.
  • Views of history, in other words, were politicized—and polarized—from the beginning.
  • To take the sting out of this confrontation between constitutional pacifists and revisionists, which had led to much political turmoil in the 1950s, mainstream conservatives made a deliberate attempt to distract people’s attention from war and politics by concentrating on economic growth.
  • For several decades, the chauvinistic right wing, with its reactionary views on everything from high school education to the emperor’s status, was kept in check by the sometimes equally dogmatic Japanese left. Marxism was the prevailing ideology of the teachers union and academics.
  • the influence of Marxism waned after the collapse of the Soviet empire in the early 1990s, and the brutal records of Chairman Mao and Pol Pot became widely known.
  • Marginalized in the de facto one-party LDP state and discredited by its own dogmatism, the Japanese left did not just wane, it collapsed. This gave a great boost to the war-justifying right-wing nationalists,
  • Japanese young, perhaps out of boredom with nothing but materialistic goals, perhaps out of frustration with being made to feel guilty, perhaps out of sheer ignorance, or most probably out of a combination of all three, are not unreceptive to these patriotic blandishments.
  • Anxiety about the rise of China, whose rulers have a habit of using Japan’s historical crimes as a form of political blackmail, has boosted a prickly national pride, even at the expense of facing the truth about the past.
  • By 1996, the LDP was back in power, the constitutional issue had not been resolved, and historical debates continue to be loaded with political ideology. In fact, they are not really debates at all, but exercises in propaganda, tilted toward the reactionary side.
  • My instinct—call it a prejudice, if you prefer—before embarking on this venture was that people from distinct cultures still react quite similarly to similar circumstances.
  • The Japanese and the Germans, on the whole, did not behave in the same ways—but then the circumstances, both wartime and postwar, were quite different in the two Germanies and Japan. They still are.
  • Our comic-book prejudices turned into an attitude of moral outrage. This made life easier in a way. It was comforting to know that a border divided us from a nation that personified evil. They were bad, so we must be good. To grow up after the war in a country that had suffered German occupation was to know that one was on the side of the angels.
  • The question that obsessed us was not how we would have acquitted ourselves in uniform, going over the top, running into machine-gun fire or mustard gas, but whether we would have joined the resistance, whether we would have cracked under torture, whether we would have hidden Jews and risked deportation ourselves. Our particular shadow was not war, but occupation.
  • the frightened man who betrayed to save his life, who looked the other way, who grasped the wrong horn of a hideous moral dilemma, interested me more than the hero. This is no doubt partly because I fear I would be much like that frightened man myself. And partly because, to me, failure is more typical of the human condition than heroism.
  • I was curious to learn how Japanese saw the war, how they remembered it, what they imagined it to have been like, how they saw themselves in view of their past. What I heard and read was often surprising to a European:
  • this led me to the related subject of modern Japanese nationalism. I became fascinated by the writings of various emperor worshippers, historical revisionists, and romantic seekers after the unique essence of Japaneseness.
  • Bataan, the sacking of Manila, the massacres in Singapore, these were barely mentioned. But the suffering of the Japanese, in China, Manchuria, the Philippines, and especially in Hiroshima and Nagasaki, was remembered vividly, as was the imprisonment of Japanese soldiers in Siberia after the war. The Japanese have two days of remembrance: August 6, when Hiroshima was bombed, and August 15, the date of the Japanese surrender.
  • The curious thing was that much of what attracted Japanese to Germany before the war—Prussian authoritarianism, romantic nationalism, pseudo-scientific racialism—had lingered in Japan while becoming distinctly unfashionable in Germany. Why?
  • the two peoples saw their own purported virtues reflected in each other: the warrior spirit, racial purity, self-sacrifice, discipline, and so on. After the war, West Germans tried hard to discard this image of themselves. This was less true of the Japanese.
  • Which meant that any residual feelings of nostalgia for the old partnership in Japan were likely to be met with embarrassment in Germany.
  • I have concentrated on the war against the Jews in the case of Germany, since it was that parallel war, rather than, say, the U-boat battles in the Atlantic, or even the battle of Stalingrad, that left the most sensitive scar on the collective memory of (West) Germany.
  • I have emphasized the war in China and the bombing of Hiroshima, for these episodes, more than others, have lodged themselves, often in highly symbolic ways, in Japanese public life.
  • Do Germans perhaps have more reason to mourn? Is it because Japan has an Asian “shame culture,” to quote Ruth Benedict’s phrase, and Germany a Christian “guilt culture”?
  • why the collective German memory should appear to be so different from the Japanese. Is it cultural? Is it political? Is the explanation to be found in postwar history, or in the history of the war itself?
  • If the two peoples still have anything in common after the war, it is a residual distrust of themselves.
  • when Michael sees thousands of German peace demonstrators, he does not see thousands of gentle people who have learned their lesson from the past; he sees “100 percent German Protestant rigorism, aggressive, intolerant, hard.”
  • To be betroffen implies a sense of guilt, a sense of shame, or even embarrassment. To be betroffen is to be speechless. But it also implies an idea of moral purity. To be betroffen is one way to “master the past,” to show contriteness, to confess, and to be absolved and purified.
  • In their famous book, written in the sixties, entitled The Inability to Mourn, Alexander and Margarethe Mitscherlich analyzed the moral anesthesia that afflicted postwar Germans who would not face their past. They were numbed by defeat; their memories appeared to be blocked. They would or could not do their labor, and confess. They appeared to have completely forgotten that they had glorified a leader who caused the death of millions.
  • There is something religious about the act of being betroffen, something close to Pietism,
  • heart of Pietism was the moral renovation of the individual, achieved by passing through the anguish of contrition into the overwhelming realization of the assurance of God’s grace.” Pietism served as an antidote to the secular and rational ideas of the French Enlightenment.
  • It began in the seventeenth century with the works of Philipp Jakob Spener. He wanted to reform the Church and bring the Gospel into daily life, as it were, by stressing good works and individual spiritual labor.
  • German television is rich in earnest discussion programs where people sit at round tables and debate the issues of the day. The audience sits at smaller tables, sipping drinks as the featured guests hold forth. The tone is generally serious, but sometimes the arguments get heated. It is easy to laugh at the solemnity of these programs, but there is much to admire about them. It is partly through these talk shows that a large number of Germans have become accustomed to political debate.
  • There was a real dilemma: at least two generations had been educated to renounce war and never again to send German soldiers to the front, educated, in other words, to want Germany to be a larger version of Switzerland. But they had also been taught to feel responsible for the fate of Israel, and to be citizens of a Western nation, firmly embedded in a family of allied Western nations. The question was whether they really could be both.
  • the Gulf War showed that German pacifism could not be dismissed simply as anti-Americanism or a rebellion against Adenauer’s West.
  • the West German mistrust of East Germans—the East Germans whose soldiers still marched in goose step, whose petit bourgeois style smacked of the thirties, whose system of government, though built on a pedestal of antifascism, contained so many disturbing remnants of the Nazi past; the East Germans, in short, who had been living in “Asia.”
  • Michael, the Israeli, compared the encounter of Westerners (“Wessies”) with Easterners (“Ossies”) with the unveiling of the portrait of Dorian Gray: the Wessies saw their own image and they didn’t like what they saw.
  • he added: “I also happen to think Japanese and Germans are racists.”
  • Germany for its Nazi inheritance and its sellout to the United States. But now that Germany had been reunified, with its specters of “Auschwitz” and its additional hordes of narrow-minded Ossies, Adenauer was deemed to have been right after
  • The picture was of Kiel in 1945, a city in ruins. He saw me looking at it and said: “It’s true that whoever is being bombed is entitled to some sympathy from us.”
  • “My personal political philosophy and maybe even my political ambition has to do with an element of distrust for the people I represent, people whose parents and grandparents made Hitler and the persecution of the Jews possible.”
  • in the seventies he had tried to nullify verdicts given in Nazi courts—without success until well into the eighties. One of the problems was that the Nazi judiciary itself was never purged. This continuity was broken only by time.
  • To bury Germany in the bosom of its Western allies, such as NATO and the EC, was to bury the distrust of Germans. Or so it was hoped. As Europeans they could feel normal, Western, civilized. Germany, the old “land in the middle,” the Central European colossus, the power that fretted over its identity and was haunted by its past, had become a Western nation.
  • It is a miracle, really, how quickly the Germans in the Federal Republic became civilized. We are truly part of the West now. We have internalized democracy. But the Germans of the former GDR, they are still stuck in a premodern age. They are the ugly Germans, very much like the West Germans after the war, the people I grew up with. They are not yet civilized.”
  • “I like the Germans very much, but I think they are a dangerous people. I don’t know why—perhaps it is race, or culture, or history. Whatever. But we Japanese are the same: we swing from one extreme to the other. As peoples, we Japanese, like the Germans, have strong collective discipline. When our energies are channeled in the right direction, this is fine, but when they are misused, terrible things happen.”
  • to be put in the same category as the Japanese—even to be compared—bothered many Germans. (Again, unlike the Japanese, who made the comparison often.) Germans I met often stressed how different they were from the Japanese,
  • To some West Germans, now so “civilized,” so free, so individualistic, so, well, Western, the Japanese, with their group discipline, their deference to authority, their military attitude toward work, might appear too close for comfort to a self-image only just, and perhaps only barely, overcome.
  • To what extent the behavior of nations, like that of individual people, is determined by history, culture, or character is a question that exercises many Japanese, almost obsessively.
  • There was not much sign of Betroffenheit on Japanese television during the Gulf War. Nor did one see retired generals explain tactics and strategy. Instead, there were experts from journalism and academe talking in a detached manner about a faraway war which was often presented as a cultural or religious conflict between West and Middle East. The history of Muslim-Christian-Jewish animosity was much discussed. And the American character was analyzed at length to understand the behavior of George Bush and General Schwarzkopf.
  • In the words of one Albrecht Fürst von Urach, a Nazi propagandist, Japanese emperor worship was “the most unique fusion in the world of state form, state consciousness, and religious fanaticism.” Fanaticism was, of course, a positive word in the Nazi lexicon.
  • the identity question nags in almost any discussion about Japan and the outside world. It
  • It was a respectable view, but also one founded on a national myth of betrayal. Japan, according to the myth, had become the unique moral nation of peace, betrayed by the victors who had sat in judgment of Japan’s war crimes; betrayed in Vietnam, in Afghanistan, in Nicaragua; betrayed by the arms race, betrayed by the Cold War; Japan had been victimized not only by the “gratuitous,” perhaps even “racist,” nuclear attacks on Hiroshima and Nagasaki, but by all subsequent military actions taken by the superpowers,
  • When the Prime Minister of Japan, Shidehara Kijuro, protested in 1946 to General MacArthur that it was all very well saying that Japan should assume moral leadership in renouncing war, but that in the real world no country would follow this example, MacArthur replied: “Even if no country follows you, Japan will lose nothing. It is those who do not support this who are in the wrong.” For a long time most Japanese continued to take this view.
  • What is so convenient in the cases of Germany and Japan is that pacifism happens to be a high-minded way to dull the pain of historical guilt. Or, conversely, if one wallows in it, pacifism turns national guilt into a virtue, almost a mark of superiority, when compared to the complacency of other nations.
  • The denial of historical discrimination is not just a way to evade guilt. It is intrinsic to pacifism. To even try to distinguish between wars, to accept that some wars are justified, is already an immoral position.
  • That Kamei discussed this common paranoia in such odd, Volkish terms could mean several things: that some of the worst European myths got stuck in Japan, that the history of the Holocaust had no impact, or that Japan is in some respects a deeply provincial place. I think all three explanations apply.
  • “the problem with the U.S.-Japan relationship is difficult. A racial problem, really. Yankees are friendly people, frank people. But, you know, it’s hard. You see, we have to be friendly …”
  • Like Oda, indeed like many people of the left, Kamei thought in racial terms. He used the word jinshu, literally race. He did not even use the more usual minzoku, which corresponds, in the parlance of Japanese right-wingers, to Volk, or the more neutral kokumin, meaning the citizens of a state.
  • many Germans in the liberal democratic West have tried to deal honestly with their nation’s terrible past, the Japanese, being different, have been unable to do so. It is true that the Japanese, compared with the West Germans, have paid less attention to the suffering they inflicted on others, and shown a greater inclination to shift the blame. And liberal democracy, whatever it may look like on paper, has not been the success in Japan that it was in the German Federal Republic. Cultural differences might account for this. But one can look at these matters in a different, more political way. In his book The War Against the West, published in London in 1938, the Hungarian scholar Aurel Kolnai followed the Greeks in his definition of the West: “For the ancient Greeks ‘the West’ (or ‘Europe’) meant society with a free constitution and self-government under recognized rules, where ‘law is king,’ whereas the ‘East’ (or ‘Asia’) signified theocratic societies under godlike rulers whom their subjects serve ‘like slaves.’
  • According to this definition, both Hitler’s Germany and prewar Japan were of the East.
  • There was a great irony here: in their zeal to make Japan part of the West, General MacArthur and his advisers made it impossible for Japan to do so in spirit. For a forced, impotent accomplice is not really an accomplice at all.
  • In recent years, Japan has often been called an economic giant and a political dwarf. But this has less to do with a traditional Japanese mentality—isolationism, pacifism, shyness with foreigners, or whatnot—than with the particular political circumstances after the war that the United States helped to create.
  • when the Cold War prompted the Americans to make the Japanese subvert their constitution by creating an army which was not supposed to exist, the worst of all worlds appeared: sovereignty was not restored, distrust remained, and resentment mounted.
  • Kamei’s hawks are angry with the Americans for emasculating Japan; Oda’s doves hate the Americans for emasculating the “peace constitution.” Both sides dislike being forced accomplices, and both feel victimized, which is one reason Japanese have a harder time than Germans in coming to terms with their wartime past.
  • As far as the war against the Jews is concerned, one might go back to 1933, when Hitler came to power. Or at the latest to 1935, when the race laws were promulgated in Nuremberg. Or perhaps those photographs of burning synagogues on the night of November 9, 1938, truly marked the first stage of the Holocaust.
  • There is the famous picture of German soldiers lifting the barrier on the Polish border in 1939, but was that really the beginning? Or did it actually start with the advance into the Rhineland in 1936, or was it the annexation of the Sudetenland, or Austria, or Czechoslovakia?
  • IT IS DIFFICULT TO SAY when the war actually began for the Germans and the Japanese. I cannot think of a single image that fixed the beginning of either war in the public mind.
  • Possibly to avoid these confusions, many Germans prefer to talk about the Hitlerzeit (Hitler era) instead of “the war.”
  • only Japanese of a liberal disposition call World War II the Pacific War. People who stick to the idea that Japan was fighting a war to liberate Asia from Bolshevism and white colonialism call it the Great East Asian War (Daitoa Senso), as in the Great East Asian Co-Prosperity Sphere.
  • The German equivalent, I suppose, would be the picture of Soviet soldiers raising their flag on the roof of the gutted Reichstag in Berlin.
  • People of this opinion separate the world war of 1941–45 from the war in China, which they still insist on calling the China Incident.
  • Liberals and leftists, on the other hand, tend to splice these wars together and call them the Fifteen-Year War (1931–45).
  • The images marking the end are more obvious.
  • argued that the struggle against Western imperialism actually began in 1853, with the arrival in Japan of Commodore Perry’s ships, and spoke of the Hundred-Year War.
  • These are among the great clichés of postwar Japan: shorthand for national defeat, suffering, and humiliation.
  • The Germans called it Zusammenbruch (the collapse) or Stunde Null (Zero Hour): everything seemed to have come to an end, everything had to start all over. The Japanese called it haisen (defeat) or shusen (termination of the war).
  • kokka (nation, state) and minzoku (race, people) are not quite of the same order as Sonderbehandlung (special treatment) or Einsatzgruppe (special action squad). The jargon of Japanese imperialism was racist and overblown, but it did not carry the stench of death camps.
  • The German people are spiritually starved, Adenauer told him. “The imagination has to be provided for.” This was no simple matter, especially in the German language, which had been so thoroughly infected by the jargon of mass murder.
  • All they had been told to believe in, the Germans and the Japanese, everything from the Führerprinzip to the emperor cult, from the samurai spirit to the Herrenvolk, from Lebensraum to the whole world under one (Japanese) roof, all that lay in ruins
  • How to purge this language from what a famous German philologist called the Lingua Tertii Imperii? “… the language is no longer lived,” wrote George Steiner in 1958, “it is merely spoken.”
  • out of defeat and ruin a new school of literature (and cinema) did arise. It is known in Germany as Trümmerliteratur (literature of the ruins). Japanese writers who came of age among the ruins called themselves the yakeato seidai (burnt-out generation). Much literature of the late forties and fifties was darkened by nihilism and despair.
  • It was as though Germany—Sonderweg or no Sonderweg—needed only to be purged of Nazism, while Japan’s entire cultural tradition had to be overhauled.
  • In Germany there was a tradition to fall back on. In the Soviet sector, the left-wing culture of the Weimar Republic was actively revived. In the Western sectors, writers escaped the rats and the ruins by dreaming of Goethe. His name was often invoked to prove that Germany, too, belonged to the humanist, enlightened strain of European civilization.
  • the Americans (and many Japanese leftists) distrusted anything associated with “feudalism,” which they took to include much of Japan’s premodern past. Feudalism was the enemy of democracy. So not only did the American censors, in their effort to teach the Japanese democracy, forbid sword-fight films and samurai dramas, but at one point ninety-eight Kabuki plays were banned too.
  • yet, what is remarkable about much of the literature of the period, or more precisely, of the literature about that time, since much of it was written later, is the deep strain of romanticism, even nostalgia. This colors personal memories of people who grew up just after the war as well.
  • If the mushroom cloud and the imperial radio speech are the clichés of defeat, the scene of an American soldier (usually black) raping a Japanese girl (always young, always innocent), usually in a pristine rice field (innocent, pastoral Japan), is a stock image in postwar movies about the occupation.
  • To Ango, then, as to other writers, the ruins offered hope. At last the Japanese, without “the fake kimono” of traditions and ideals, were reduced to basic human needs; at last they could feel real love, real pain; at last they would be honest. There was no room, among the ruins, for hypocrisy.
  • Böll was able to be precise about the end of the Zusammenbruch and the beginning of bourgeois hypocrisy and moral amnesia. It came on June 20, 1948, the day of the currency reform, the day that Ludwig Erhard, picked by the Americans as Economics Director in the U.S.-British occupation zone, gave birth to the Deutsche Mark. The DM, from then on, would be the new symbol of West German national pride;
  • the amnesia, and definitely the identification with the West, was helped further along by the Cold War. West Germany now found itself on the same side as the Western allies. Their common enemy was the “Asiatic” Soviet empire. Fewer questions needed to be asked.
  • Indeed, to some people the Cold War simply confirmed what they had known all along: Germany always had been on the right side, if only our American friends had realized it earlier.
  • The process of willed forgetfulness culminated in the manic effort of reconstruction, in the great rush to prosperity.
  • “Prosperity for All” was probably the best that could have happened to the Germans of the Federal Republic. It took the seed of resentment (and thus future extremism) out of defeat. And the integration of West Germany into a Western alliance was a good thing too.
  • The “inability to mourn,” the German disassociation from the piles of corpses strewn all over Central and Eastern Europe, so that the Third Reich, as the Mitscherlichs put it, “faded like a dream,” made it easier to identify with the Americans, the victors, the West.
  • Yet the disgust felt by Böll and others for a people getting fat (“flabby” is the usual term, denoting sloth and decadence) and forgetting about its murderous past was understandable.
  • The Brückners were the price Germany had to pay for the revival of its fortunes. Indeed, they were often instrumental in it. They were the apparatchiks who functioned in any system, the small, efficient fish who voted for Christian conservatives in the West and became Communists in the East.
  • Staudte was clearly troubled by this, as were many Germans, but he offered no easy answers. Perhaps it was better this way: flabby democrats do less harm than vengeful old Nazis.
  • the forgetful, prosperous, capitalist Federal Republic of Germany was in many more or less hidden ways a continuation of Hitler’s Reich. This perfectly suited the propagandists of the GDR, who would produce from time to time lists of names of former Nazis who were prospering in the West. These lists were often surprisingly accurate.
  • In a famous film, half fiction, half documentary, made by a number of German writers and filmmakers (including Böll) in 1977, the continuity was made explicit. The film, called Germany in Autumn (Deutschland in Herbst),
  • Rainer Werner Fassbinder was one of the participants in this film. A year later he made The Marriage of Maria Braun.
  • To lifelong “antifascists” who had always believed that the Federal Republic was the heir to Nazi Germany, unification seemed—so they said—almost like a restoration of 1933. The irony was that many Wessies saw their new Eastern compatriots as embarrassing reminders of the same unfortunate past.
  • Rarely was the word “Auschwitz” heard more often than during the time of unification, partly as an always salutary reminder that Germans must not forget, but partly as an expression of pique that the illusion of a better, antifascist, anticapitalist, idealistic Germany, born in the ruins of 1945, and continued catastrophically for forty years in the East, had now been dashed forever.
  • Ludwig Erhard’s almost exact counterpart in Japan was Ikeda Hayato, Minister of Finance from 1949 and Prime Minister from 1960 to 1964. His version of Erhard’s “Prosperity for All” was the Double Your Incomes policy, which promised to make the Japanese twice as rich in ten years. Japan had an average growth rate of 11 percent during the 1960s.
  • It explains, at any rate, why the unification of the two Germanys was considered a defeat by antifascists on both sides of the former border.
  • Very few wartime bureaucrats had been purged. Most ministries remained intact. Instead it was the Communists, who had welcomed the Americans as liberators, who were purged after 1949, the year China was “lost.”
  • so the time of ruins was seen by people on the left as a time of missed chances and betrayal. Far from achieving a pacifist utopia of popular solidarity, they ended up with a country driven by materialism, conservatism, and selective historical amnesia.
  • the “red purges” of 1949 and 1950 and the return to power of men whose democratic credentials were not much better helped to turn many potential Japanese friends of the United States into enemies. For the Americans were seen as promoters of the right-wing revival and the crackdown on the left.
  • For exactly twelve years Germany was in the hands of a criminal regime, a bunch of political gangsters who had started a movement. Removing this regime was half the battle.
  • It is easier to change political institutions and hope that habits and prejudices will follow. This, however, was more easily done in Germany than in Japan.
  • There had not been a cultural break either in Japan. There were no exiled writers and artists who could return to haunt the consciences of those who had stayed.
  • There was no Japanese Thomas Mann or Alfred Döblin. In Japan, everyone had stayed.
  • In Japan there was never a clear break between a fascist and a prefascist past. In fact, Japan was never really a fascist state at all. There was no fascist or National Socialist ruling party, and no Führer either. The closest thing to it would have been the emperor, and whatever else he may have been, he was not a fascist dictator.
  • whereas after the war Germany lost its Nazi leaders, Japan lost only its admirals and generals.
  • Japan was effectively occupied only by the Americans. West Germany was part of NATO and the European Community, and the GDR was in the Soviet empire. Japan’s only formal alliance is with the United States, through a security treaty that many Japanese have opposed.
  • But the systematic subservience of Japan meant that the country never really grew up. There is a Japanese fixation on America, an obsession which goes deeper, I believe, than German anti-Americanism,
  • Yet nothing had stayed entirely the same in Japan. The trouble was that virtually all the changes were made on American orders. This was, of course, the victor’s prerogative, and many changes were beneficial.
  • like in fiction. American Hijiki, a novella by Nosaka Akiyuki, is, to my mind, a masterpiece in the short history of Japanese Trümmerliteratur.
  • Older Japanese do, however, remember the occupation, the first foreign army occupation in their national history. But it was, for the Japanese, a very unusual army. Whereas the Japanese armies in Asia had brought little but death, rape, and destruction, this one came with Glenn Miller music, chewing gum, and lessons in democracy. These blessings left a legacy of gratitude, rivalry, and shame.
  • did these films teach the Japanese democracy? Oshima thinks not. Instead, he believes, Japan learned the values of “progress” and “development.” Japan wanted to be just as rich as America—no, even richer:
  • think it is a romantic assumption, based less on history than on myth; a religious notion, expressed less through scholarship than through monuments, memorials, and historical sites turned into sacred grounds.
  • The past, wrote the West German historian Christian Meier, is in our bones. “For a nation to appropriate its history,” he argued, “is to look at it through the eyes of identity.” What we have “internalized,” he concluded, is Auschwitz.
  • Auschwitz is such a place, a sacred symbol of identity for Jews, Poles, and perhaps even Germans. The question is what or whom Germans are supposed to identify with.
  • The idea that visiting the relics of history brings the past closer is usually an illusion. The opposite is more often true.
  • To visit the site of suffering, any description of which cannot adequately express the horror, is upsetting, not because one gets closer to knowing what it was actually like to be a victim, but because such visits stir up emotions one cannot trust. It is tempting to take on the warm moral glow of identification—so easily done and so presumptuous—with the victims:
  • Were the crimes of Auschwitz, then, part of the German “identity”? Was genocide a product of some ghastly flaw in German culture, the key to which might be found in the sentimental proverbs, the cruel fairy tales, the tight leather shorts?
  • yet the imagination is the only way to identify with the past. Only in the imagination—not through statistics, documents, or even photographs—do people come alive as individuals, do stories emerge, instead of History.
  • nature. It is all right to let the witnesses speak, in the courtroom, in the museums, on videotape (Claude Lanzmann’s Shoah has been shown many times on German television), but it is not all right for German artists to use their imagination.
  • the reluctance in German fiction to look Auschwitz in the face, the almost universal refusal to deal with the Final Solution outside the shrine, the museum, or the schoolroom, suggests a fear of committing sacrilege.
  • beneath the fear of bad taste or sacrilege may lie a deeper problem. To imagine people in the past as people of flesh and blood, not as hammy devils in silk capes, is to humanize them. To humanize is not necessarily to excuse or to sympathize, but it does demolish the barriers of abstraction between us and them. We could, under certain circumstances, have been them.
  • the flight into religious abstraction was to be all too common among Germans of the Nazi generation, as well as their children; not, as is so often the case with Jews, to lend mystique to a new identity, as a patriotic Zionist, but on the contrary to escape from being the heir to a peculiarly German crime, to get away from having to “internalize” Auschwitz, or indeed from being German at all.
  • a Hollywood soap opera, a work of skillful pop, which penetrated the German imagination in a way nothing had before. Holocaust was first shown in Germany in January 1979. It was seen by 20 million people, about half the adult population of the Federal Republic; 58 percent wanted it to be repeated; 12,000 letters, telegrams, and postcards were sent to the broadcasting stations; 5,200 called the stations by telephone after the first showing; 72.5 percent were positive, 7.3 percent negative.
  • “After Holocaust,” wrote a West German woman to her local television station, “I feel deep contempt for those beasts of the Third Reich. I am twenty-nine years old and a mother of three children. When I think of the many mothers and children sent to the gas chambers, I have to cry. (Even today the Jews are not left in peace. We Germans have the duty to work every day for peace in Israel.) I bow to the victims of the Nazis, and I am ashamed to be a German.”
  • Auschwitz was a German crime, to be sure. “Death is a master from Germany.” But it was a different Germany. To insist on viewing history through the “eyes of identity,” to repeat the historian Christian Meier’s phrase, is to resist the idea of change.
  • Is there no alternative to these opposing views? I believe there is.
  • The novelist Martin Walser, who was a child during the war, believes, like Meier, that Auschwitz binds the German people, as does the language of Goethe. When a Frenchman or an American sees pictures of Auschwitz, “he doesn’t have to think: We human beings! He can think: Those Germans! Can we think: Those Nazis! I for one cannot …”
  • Adorno, a German Jew who wished to save high German culture, on whose legacy the Nazis left their bloody finger marks, resisted the idea that Auschwitz was a German crime. To him it was a matter of modern pathology, the sickness of the “authoritarian personality,” of the dehumanized SS guards, those inhumane cogs in a vast industrial wheel.
  • To the majority of Japanese, Hiroshima is the supreme symbol of the Pacific War. All the suffering of the Japanese people is encapsulated in that almost sacred word: Hiroshima. But it is more than a symbol of national martyrdom; Hiroshima is a symbol of absolute evil, often compared to Auschwitz.
  • has the atmosphere of a religious center. It has martyrs, but no single god. It has prayers, and it has a ready-made myth about the fall of man. Hiroshima, says a booklet entitled Hiroshima Peace Reader, published by the Hiroshima Peace Culture Foundation, “is no longer merely a Japanese city. It has become recognized throughout the world as a Mecca of world peace.”
  • They were not enshrined in the Japanese park, and later attempts by local Koreans to have the monument moved into Peace Park failed. There could only be one cenotaph, said the Hiroshima municipal authorities. And the cenotaph did not include Koreans.
  • What is interesting about Hiroshima—the Mecca rather than the modern Japanese city, which is prosperous and rather dull—is the tension between its universal aspirations and its status as the exclusive site of Japanese victimhood.
  • it is an opinion widely held by Japanese nationalists. The right always has been concerned with the debilitating effects on the Japanese identity of war guilt imposed by American propaganda.
  • The Japanese, in contrast, were duped by the Americans into believing that the traces of Japanese suffering should be swept away by the immediate reconstruction of Hiroshima. As a result, the postwar Japanese lack an identity and their racial virility has been sapped by American propaganda about Japanese war guilt.
  • Hiroshima, Uno wrote, should have been left as it was, in ruins, just as Auschwitz, so he claims, was deliberately preserved by the Jews. By reminding the world of their martyrdom, he said, the Jews have kept their racial identity intact and restored their virility.
  • But the idea that the bomb was a racist experiment is less plausible, since the bomb was developed for use against Nazi Germany.
  • There is another view, however, held by leftists and liberals, who would not dream of defending the “Fifteen-Year War.” In this view, the A-bomb was a kind of divine punishment for Japanese militarism. And having learned their lesson through this unique suffering, having been purified through hellfire and purgatory, so to speak, the Japanese people have earned the right, indeed have the sacred duty, to sit in judgment of others, specifically the United States, whenever they show signs of sinning against the “Hiroshima spirit.”
  • The left has its own variation of Japanese martyrdom, in which Hiroshima plays a central role. It is widely believed, for instance, that countless Japanese civilians fell victim to either a wicked military experiment or to the first strike in the Cold War, or both.
  • However, right-wing nationalists care less about Hiroshima than about the idée fixe that the “Great East Asian War” was to a large extent justified.
  • This is at the heart of what is known as Peace Education, which has been much encouraged by the leftist Japan Teachers’ Union and has been regarded with suspicion by the conservative government. Peace Education has traditionally meant pacifism, anti-Americanism, and a strong sympathy for Communist states, especially China.
  • The A-bomb, in this version, was dropped to scare the Soviets away from invading Japan. This at least is an arguable position.
  • left-wing pacifism in Japan has something in common with the romantic nationalism usually associated with the right: it shares the right’s resentment about being robbed by the Americans of what might be called a collective memory.
  • The romantic pacifists believe that the United States, to hide its own guilt and to rekindle Japanese militarism in aid of the Cold War, tried to wipe out the memory of Hiroshima.
  • few events in World War II have been described, analyzed, lamented, reenacted, re-created, depicted, and exhibited so much and so often as the bombing of Hiroshima
  • The problem with Nagasaki was not just that Hiroshima came first but also that Nagasaki had more military targets than Hiroshima. The Mitsubishi factories in Nagasaki produced the bulk of Japanese armaments. There was also something else, which is not often mentioned: the Nagasaki bomb exploded right over the area where outcasts and Christians lived. And unlike in Hiroshima, much of the rest of the city was spared the worst.
  • yet, despite these diatribes, the myth of Hiroshima and its pacifist cult is based less on American wickedness than on the image of martyred innocence and visions of the apocalypse.
  • The comparison between Hiroshima and Auschwitz is based on this notion; the idea, namely, that Hiroshima, like the Holocaust, was not part of the war, not even connected with it, but “something that occurs at the end of the world
  • still I wonder whether it is really so different from the position of many Germans who wish to “internalize” Auschwitz, who see Auschwitz “through the eyes of identity.”
  • the Japanese to take two routes at once, a national one, as unique victims of the A-bomb, and a universal one, as the apostles of the Hiroshima spirit. This, then, is how Japanese pacifists, engaged in Peace Education, define the Japanese identity.
  • the case for Hiroshima is at least open to debate. The A-bomb might have saved lives; it might have shortened the war. But such arguments are incompatible with the Hiroshima spirit.
  • In either case, nationality has come to be based less on citizenship than on history, morality, and a religious spirit.
  • The problem with this quasi-religious view of history is that it makes it hard to discuss past events in anything but nonsecular terms. Visions of absolute evil are unique, and they are beyond human explanation or even comprehension. To explain is hubristic and amoral.
  • in the history of Japan’s foreign wars, the city of Hiroshima is far from innocent. When Japan went to war with China in 1894, the troops set off for the battlefronts from Hiroshima, and the Meiji emperor moved his headquarters there. The city grew wealthy as a result. It grew even wealthier when Japan went to war with Russia eleven years later, and Hiroshima once again became the center of military operations. As the Hiroshima Peace Reader puts it with admirable conciseness, “Hiroshima, secure in its position as a military city, became more populous and prosperous as wars and incidents occurred throughout the Meiji and Taisho periods.” At the time of the bombing, Hiroshima was the base of the Second General Headquarters of the Imperial Army (the First was in Tokyo). In short, the city was swarming with soldiers. One of the few literary masterpieces to emerge
  • when a local group of peace activists petitioned the city of Hiroshima in 1987 to incorporate the history of Japanese aggression into the Peace Memorial Museum, the request was turned down. The petition for an “Aggressors’ Corner” was prompted by junior high school students from Osaka, who had embarrassed Peace Museum officials by asking for an explanation about Japanese responsibility for the war.
  • Yukoku Ishinkai (Society for Lament and National Restoration), thought the bombing had saved Japan from total destruction. But he insisted that Japan could not be held solely responsible for the war. The war, he said, had simply been part of the “flow of history.”
  • They also demanded an official recognition of the fact that some of the Korean victims of the bomb had been slave laborers. (Osaka, like Kyoto and Hiroshima, still has a large Korean population.) Both requests were denied. So a group called Peace Link was formed, from local people, many of whom were Christians, antinuclear activists, or involved with discriminated-against minorities.
  • The history of the war, or indeed any history, is not what the Hiroshima spirit is about. This is why Auschwitz is the only comparison that is officially condoned. Anything else is too controversial, too much part of the “flow of history.”
  • “You see, this museum was not really intended to be a museum. It was built by survivors as a place of prayer for the victims and for world peace. Mankind must build a better world. That is why Hiroshima must persist. We must go back to the basic roots. We must think of human solidarity and world peace. Otherwise we just end up arguing about history.”
  • Only when a young Japanese history professor named Yoshimi Yoshiaki dug up a report in American archives in the 1980s did it become known that the Japanese had stored 15,000 tons of chemical weapons on and near the island and that a 200-kilogram container of mustard gas was buried under Hiroshima.
  • what was the largest toxic gas factory in the Japanese Empire. More than 5,000 people worked there during the war, many of them women and schoolchildren. About 1,600 died of exposure to hydrocyanic acid gas, nausea gas, and lewisite. Some were damaged for life. Official Chinese sources claim that more than 80,000 Chinese fell victim to gases produced at the factory. The army was so secretive about the place that the island simply disappeared from Japanese maps.
  • in 1988, through the efforts of survivors, the small museum was built, “to pass on,” in the words of the museum guide, “the historical truth to future generations.”
  • Surviving workers from the factory, many of whom suffered from chronic lung diseases, asked for official recognition of their plight in the 1950s. But the government turned them down. If the government had compensated the workers, it would have been an official admission that the Japanese Army had engaged in an illegal enterprise. When a brief mention of chemical warfare crept into Japanese school textbooks, the Ministry of Education swiftly took it out.
  • I asked him about the purpose of the museum. He said: “Before shouting ‘no more war,’ I want people to see what it was really like. To simply look at the past from the point of view of the victim is to encourage hatred.”
  • “Look,” he said, “when you fight another man, and hit him and kick him, he will hit and kick back. One side will win. How will this be remembered? Do we recall that we were kicked, or that we started the kicking ourselves? Without considering this question, we cannot have peace.”
  • The fact that Japanese had buried poison gas under Hiroshima did not lessen the horror of the A-bomb. But it put Peace Park, with all its shrines, in a more historical perspective. It took the past away from God and put it in the fallible hands of man.
  • What did he think of the Peace Museum in Hiroshima? “At the Hiroshima museum it is easy to feel victimized,” he said. “But we must realize that we were aggressors too. We were educated to fight for our country. We made toxic gas for our country. We lived to fight the war. To win the war was our only goal.”
  • Nanking, as the capital of the Nationalist government, was the greatest prize in the attempted conquest of China. Its fall was greeted in Japan with banner headlines and nationwide celebration. For six weeks Japanese Army officers allowed their men to run amok. The figures are imprecise, but tens of thousands, perhaps hundreds of thousands (the Chinese say 300,000) of Chinese soldiers and civilians, many of them refugees from other towns, were killed. And thousands of women between the ages of about nine and seventy-five were raped, mutilated, and often murdered.
  • Was it a deliberate policy to terrorize the Chinese into submission? The complicity of the officers suggests there was something to this. But it might also have been a kind of payoff to the Japanese troops for slogging through China in the freezing winter without decent pay or rations. Or was it largely a matter of a peasant army running out of control? Or just the inevitable consequence of war, as many Japanese maintain?
  • inevitable cruelty of war. An atrocity is a willful act of criminal brutality, an act that violates the law as well as any code of human decency. It isn’t that the Japanese lack such codes or are morally incapable of grasping the concept. But “atrocity,” like “human rights,” is part of a modern terminology which came from the West, along with “feminism,” say, or “war crimes.” To right-wing nationalists it has a leftist ring, something subversive, something almost anti-Japanese.
  • During the Tokyo War Crimes Tribunal, Nanking had the same resonance as Auschwitz had in Nuremberg. And being a symbol, the Nanking Massacre is as vulnerable to mythology and manipulation as Auschwitz and Hiroshima.
  • Mori’s attitude also raises doubts about Ruth Benedict’s distinction between Christian “guilt culture” and Confucian “shame culture.”
  • In her opinion, a “society that inculcates absolute standards of morality and relies on man’s developing a conscience is a guilt culture by definition …” But in “a culture where shame is a major sanction, people are chagrined about acts which we expect people to feel guilty about.” However, this “chagrin cannot be relieved, as guilt can be, by confession and atonement …”
  • memory was admitted at all, the Mitscherlichs wrote about Germans in the 1950s, “it was only in order to balance one’s own guilt against that of others. Many horrors had been unavoidable, it was claimed, because they had been dictated by crimes committed by the adversary.” This was precisely what many Japanese claimed, and still do claim. And it is why Mori insists on making his pupils view the past from the perspective of the aggressors.
  • Two young Japanese officers, Lieutenant N. and Lieutenant M., were on their way to Nanking and decided to test their swordsmanship: the first to cut off one hundred Chinese heads would be the winner. And thus they slashed their way through Chinese ranks, taking scalps in true samurai style. Lieutenant M. got 106, and Lieutenant N. bagged 105.
  • The story made a snappy headline in a major Tokyo newspaper: “Who Will Get There First! Two Lieutenants Already Claimed 80.” In the Nanking museum is a newspaper photograph of the two friends, glowing with youthful high spirits. Lieutenant N. boasted in the report that he had cut the necks off 56 men without even denting the blade of his ancestral sword.
  • I was told by a Japanese veteran who had fought in Nanking that such stories were commonly made up or at least exaggerated by Japanese reporters, who were ordered to entertain the home front with tales of heroism.
  • Honda Katsuichi, a famous Asahi Shimbun reporter, was told the story in Nanking. He wrote it up in a series of articles, later collected in a book entitled A Journey to China, published in 1981.
  • the whole thing developed into the Nankin Ronso, or Nanking Debate. In 1984, an anti-Honda book came out, by Tanaka Masaaki, entitled The Fabrication of the “Nanking Massacre.”
  • back in Japan, Lieutenant M. began to revise his story. Speaking at his old high school, he said that in fact he had beheaded only four or five men in actual combat. As for the rest … “After we occupied the city, I stood facing a ditch, and told the Chinese prisoners to step forward. Since Chinese soldiers are stupid, they shuffled over to the ditch, one by one, and I cleanly cut off their heads.”
  • The nationalist intellectuals are called goyo gakusha by their critics. It is a difficult term to translate, but the implied meaning is “official scholars,” who do the government’s bidding.
  • the debate on the Japanese war is conducted almost entirely outside Japanese universities, by journalists, amateur historians, political columnists, civil rights activists, and so forth. This means that the zanier theories of the likes of Tanaka…
  • The other reason was that modern history was not considered academically respectable. It was too fluid, too political, too controversial. Until 1955, there was not one modern historian on the staff of Tokyo University. History stopped around the middle of the nineteenth century. And even now, modern…
  • In any case, so the argument invariably ends, Hiroshima, having been planned in cold blood, was a far worse crime. “Unlike in Europe or China,” writes Tanaka, “you won’t find one instance of planned, systematic murder in the entire history of Japan.” This is because the Japanese…
  • One reason is that there are very few modern historians in Japan. Until the end of the war, it would have been dangerously subversive, even blasphemous, for a critical scholar to write about modern…
  • they have considerable influence on public opinion, as television commentators, lecturers, and contributors to popular magazines. Virtually none of them are professional historians.
  • Tanaka and others have pointed out that it is physically impossible for one man to cut off a hundred heads with one blade, and that for the same reason Japanese troops could never have…
  • Besides, wrote Tanaka, none of the Japanese newspapers reported any massacre at the time, so why did it suddenly come up…
  • He admits that a few innocent people got killed in the cross fire, but these deaths were incidental. Some soldiers were doubtless a bit rough, but…
  • even he defends an argument that all the apologists make too: “On the battlefield men face the ultimate extremes of human existence, life or death. Extreme conduct, although still ethically…
  • atrocities carried out far from the battlefield dangers and imperatives and according to a rational plan were acts of evil barbarism. The Auschwitz gas chambers of our ‘ally’ Germany and the atomic bombing of our…
  • The point that it was not systematic was made by leftist opponents of the official scholars too. The historian Ienaga Saburo, for example, wrote that the Nanking Massacre, whose scale and horror he does not deny, “may have been a reaction to the fierce Chinese resistance after the Shanghai fighting.” Ienaga’s…
  • The nationalist right takes the opposite view. To restore the true identity of Japan, the emperor must be reinstated as a religious head of state, and Article Nine must be revised to make Japan a legitimate military power again. For this reason, the Nanking Massacre, or any other example of extreme Japanese aggression, has to be ignored, softened, or denied.
  • the question remains whether the raping and killing of thousands of women, and the massacre of thousands, perhaps hundreds of thousands, of other unarmed people, in the course of six weeks, can still be called extreme conduct in the heat of battle. The question is pertinent, particularly when such extreme violence is justified by an ideology which teaches the aggressors that killing an inferior race is in accordance with the will of their divine emperor.
  • The politics behind the symbol are so divided and so deeply entrenched that they hinder a rational historical debate about what actually happened in 1937. The more one side insists on Japanese guilt, the more the other insists on denying it.
  • The Nanking Massacre, for leftists and many liberals too, is the main symbol of Japanese militarism, supported by the imperial (and imperialist) cult. Which is why it is a keystone of postwar pacifism. Article Nine of the constitution is necessary to avoid another Nanking Massacre.
  • The Japanese, he said, should see their history through their own eyes, for “if we rely on the information of aliens and alien countries, who use history for the sake of propaganda, then we are in danger of losing the sense of our own history.” Yet another variation of seeing history through the eyes of identity.
  • their emotions were often quite at odds with the idea of “shame culture” versus “guilt culture.” Even where the word for shame, hazukashii, was used, its meaning was impossible to distinguish from the Western notion of guilt.
  • wasn’t so bad in itself. But then they killed them. You see, rape was against military regulations, so we had to destroy the evidence. While the women were fucked, they were considered human, but when we killed them, they were just pigs. We felt no shame about it, no guilt. If we had, we couldn’t have done it.
  • “Whenever we would enter a village, the first thing we’d do was steal food, then we’d take the women and rape them, and finally we’d kill all the men, women, and children to make sure they couldn’t slip away and tell the Chinese troops where we were. Otherwise we wouldn’t have been able to sleep at night.”
  • Clearly, then, the Nanking Massacre had been the culmination of countless massacres on a smaller scale. But it had been mass murder without a genocidal ideology. It was barbaric, but to Azuma and his comrades, barbarism was part of war.
  • “Sexual desire is human,” he said. “Since I suffered from a venereal disease, I never actually did it with Chinese women. But I did peep at their private parts. We’d always order them to drop their trousers. They never wore any underwear, you know. But the others did it with any woman that crossed our path.
  • He did have friends, however, who took part in the killings. One of them, Masuda Rokusuke, killed five hundred men by the Yangtze River with his machine gun. Azuma visited his friend in the hospital just before he died in the late 1980s. Masuda was worried about going to hell. Azuma tried to reassure him that he was only following orders. But Masuda remained convinced that he was going to hell.
  • “One of the worst moments I can remember was the killing of an old man and his grandson. The child was bayoneted and the grandfather started to suck the boy’s blood, as though to conserve his grandson’s life a bit longer. We watched a while and then killed both. Again, I felt no guilt, but I was bothered by this kind of thing. I felt confused. So I decided to keep a diary. I thought it might help me think straight.”
  • What about his old comrades? I asked. How did they discuss the war? “Oh,” said Azuma, “we wouldn’t talk about it much. When we did, it was to justify it. The Chinese resisted us, so we had to do what we did, and so on. None of us felt any remorse. And I include myself.”
  • got more and more agitated. “They turned the emperor into a living god, a false idol, like the Ayatollah in Iran or like Kim Il Sung. Because we believed in the divine emperor, we were prepared to do anything, anything at all, kill, rape, anything. But I know he fucked his wife every night, just like we do …” He paused and lowered his voice. “But you know we cannot say this in Japan, even today. It is impossible in this country to tell the truth.”
  • My first instinct was to applaud West German education. Things had come a long way since 1968. There had been no school classes at Nuremberg, or even at the Auschwitz trial in Frankfurt from 1963 till 1965. Good for the teacher, I thought. Let them hear what was done. But I began to have doubts.
  • Just as belief belongs in church, surely history education belongs in school. When the court of law is used for history lessons, then the risk of show trials cannot be far off. It may be that show trials can be good politics—though I have my doubts about this too. But good politics don’t necessarily serve the truth.
  • There is a story about the young Richard von Weizsäcker when he was in Nuremberg at the time of the war crimes trials. He is said to have turned to a friend and to have remarked, in his best Wehrmacht officer style, that they should storm the court and release the prisoners. The friend, rather astonished, asked why on earth they should do such a thing. “So that we can try them ourselves” was Weizsäcker’s alleged response.
  • There was also concern that international law might not apply to many of the alleged crimes. If revenge was the point, why drag the law into it? Why not take a political decision to punish? This was what Becker, in his office, called the Italian solution: “You kill as many people as you can in the first six weeks, and then you forget about it: not very legal, but for the purposes of purification, well …”
  • Becker was not against holding trials as such. But he believed that existing German laws should have been applied, instead of retroactive laws about crimes against peace (preparing, planning, or waging an aggressive war).
  • It was to avoid a travesty of the legal process that the British had been in favor of simply executing the Nazi leaders without a trial. The British were afraid that a long trial might change public opinion. The trial, in the words of one British diplomat, might be seen as a “put-up job.”
  • The question is how to achieve justice without distorting the law, and how to stage a trial by victors over the vanquished without distorting history. A possibility would have been to make victors’ justice explicit, by letting military courts try the former enemies.
  • This would have avoided much hypocrisy and done less damage to the due process of law in civilian life. But if the intention was to teach Germans a history lesson, a military court would have run into the same problems as a civilian one.
  • Due process or revenge. This problem had preoccupied the ancient Greek tragedians. To break the cycle of vendetta, Orestes had to be tried by the Athens court for the murder of his mother. Without a formal trial, the vengeful Furies would continue to haunt the living.
  • The aspect of revenge might have been avoided had the trial been held by German judges. There was a precedent for this, but it was not a happy one. German courts had been allowed to try alleged war criminals after World War I. Despite strong evidence against them, virtually all were acquitted, and the foreign delegates were abused by local mobs. Besides, Wetzka was right: German judges had collaborated with the Nazi regime; they could hardly be expected to be impartial. So it was left to the victors to see that justice was done.
  • When the American chief prosecutor in Nuremberg, Robert H. Jackson, was asked by the British judge, Lord Justice Lawrence, what he thought the purpose of the trials should be, Jackson answered that they were to prove to the world that the German conduct of the war had been unjustified and illegal, and to demonstrate to the German people that this conduct deserved severe punishment and to prepare them for
  • What becomes clear from this kind of language is that law, politics, and religion became confused: Nuremberg became a morality play, in which Göring, Kaltenbrunner, Keitel, and the others were cast in the leading roles. It was a play that claimed to deliver justice, truth, and the defeat of evil.
  • The Nuremberg trials were to be a history lesson, then, as well as a symbolic punishment of the German people—a moral history lesson cloaked in all the ceremonial trappings of due legal process. They were the closest that man, or at least the men belonging to the victorious powers, could come to dispensing divine justice. This was certainly the way some German writers felt about it. Some welcomed it
  • We now have this law on our books, the prosecutor said: “It will be used against the German aggressor this time. But the four powers, who are conducting this trial in the name of twenty-three nations, know this law and declare: Tomorrow we shall be judged before history by the same yardstick by which we judge these defendants today.”
  • “We had seen through the amorality of the Nazis, and wanted to rid ourselves of it. It was from the moral seriousness of the American prosecution that we wished to learn sensible political thinking. “And we did learn. “And we allowed ourselves to apply this thinking to the present time. For example, we will use it now to take quite literally the morality of those American prosecutors. Oradour and Lidice—today they are cities in South Vietnam” (Italics in the original text.)
  • The play ends with a statement by the American prosecutor on crimes against peace
  • (It was decided in 1979, after the shock of the Holocaust TV series, to abolish the statute of limitations for crimes against humanity.)
  • after Nuremberg, most Germans were tired of war crimes. And until the mid-1950s German courts were permitted to deal only with crimes committed by Germans against other Germans. It took the bracing example of the Eichmann trial in Jerusalem to jolt German complacency—that, and the fact that crimes committed before 1946 would no longer be subject to prosecution after 1965.
  • Trying the vanquished for conventional war crimes was never convincing, since the victors could be accused of the same. Tu quoque could be invoked, in private if not in the Nuremberg court, when memories of Dresden and Soviet atrocities were still fresh. But Auschwitz had no equivalent. That was part of another war, or, better, it was not really a war at all; it was mass murder pure and simple, not for reasons of strategy or tactics, but of ideology alone.
  • Whether you are a conservative who wants Germany to be a “normal” nation or a liberal/leftist engaging in the “labor of mourning,” the key event of World War II is Auschwitz, not the Blitzkrieg, not Dresden, not even the war on the eastern front. This was the one history lesson of Nuremberg that stuck. As Hellmut Becker said, despite his skepticism about Nuremberg: “It was most important that the German population realized that crimes against humanity had taken place and that during the trials it became clear how they had taken place.”
  • In his famous essay on German guilt, Die Schuldfrage (The Question of German Guilt), written in 1946, Karl Jaspers distinguished four categories of guilt: criminal guilt, for breaking the law; political guilt, for being part of a criminal political system; moral guilt, for personal acts of criminal behavior; and metaphysical guilt, for failing in one’s responsibility to maintain the standards of civilized humanity. Obviously these categories overlap.
  • The great advantage, in his view, of a war crimes trial was its limitation. By allowing the accused to defend themselves with arguments, by laying down the rules of due process, the victors limited their own powers.
  • In any event, the trial distanced the German people even further from their former leaders. It was a comfortable distance, and few people had any desire to bridge it. This might be why the Nazi leaders are hardly ever featured in German plays, films, or novels.
  • And: “For us Germans this trial has the advantage that it distinguishes between the particular crimes of the leaders and that it does not condemn the Germans collectively.”
  • Serious conservative intellectuals, such as Hermann Lübbe, argued that too many accusations would have blocked West Germany’s way to becoming a stable, prosperous society. Not that Lübbe was an apologist for the Third Reich. Far from it: the legitimacy of the Federal Republic, in his opinion, lay in its complete rejection of the Nazi state.
  • their reaction was often one of indignation. “Why me?” they would say. “I just did my duty. I just followed orders like every decent German. Why must I be punished?”
  • “that these criminals were so like all of us at any point between 1918 and 1945 that we were interchangeable, and that particular circumstances caused them to take a different course, which resulted in this trial, these matters could not be properly discussed in the courtroom.” The terrible acts of individuals are lifted from their historical context. History is reduced to criminal pathology and legal argument.
  • they will not do as history lessons, nor do they bring us closer to that elusive thing that Walser seeks, a German identity.
  • The GDR had its own ways of using courts of law to deal with the Nazi past. They were in many respects the opposite of West German ways. The targets tended to be the very people that West German justice had ignored.
  • Thorough purges took place in the judiciary, the bureaucracy, and industry. About 200,000 people—four-fifths of the Nazi judges and prosecutors—lost their jobs. War crimes trials were held too; until 1947 by the Soviets, after that in German courts.
  • There were two more before 1957, and none after that. All in all, about 30,000 people had been tried and 500 executed. In the Federal Republic the number was about 91,000, and none were executed, as the death penalty was abolished by the 1949 constitution.
  • East German methods were both ruthless and expedient, and the official conclusion to the process was that the GDR no longer had to bear the burden of guilt. As state propaganda ceaselessly pointed out, the guilty were all in the West. There the fascists still sat as judges and ran the industries that produced the economic boom, the Wirtschaftswunder.
  • society. Although some of his critics, mostly on the old left, in both former Germanys, called him a grand inquisitor, few doubted the pastor’s good intentions. His arguments for trials were moral, judicial, and historical. He set out his views in a book entitled The Stasi Documents. Echoes of an earlier past rang through almost every page. “We can
  • Germany of the guilty, the people who felt betroffen by their own “inability to mourn,” the nation that staged the Auschwitz and Majdanek trials, that Germany was now said to stand in judgment over the other Germany—the Germany of the old antifascists, the Germany that had suffered under two dictatorships, the Germany of uniformed marches, goose-stepping drills, and a secret police network, vast beyond even the Gestapo’s dreams.
  • It is almost a form of subversion to defend a person who stands accused in court. So the idea of holding political and military leaders legally accountable for their actions was even stranger in Japan than it was in Germany. And yet, the shadows thrown by the Tokyo trial have been longer and darker in Japan than those of the Nuremberg trial in Germany.
  • never was—unlike, say, the railway station or the government ministry—a central institution of the modern Japanese state. The law was not a means to protect the people from arbitrary rule; it was, rather, a way for the state to exercise more control over the people. Even today, there are relatively few lawyers in Japan.
  • Japanese school textbooks are the product of so many compromises that they hardly reflect any opinion at all. As with all controversial matters in Japan, the more painful, the less said. In a standard history textbook for middle school students, published in the 1980s, mention of the Tokyo trial takes up less than half a page. All it says is that the trial…
  • As long as the British and the Americans continued to be oppressors in Asia, wrote a revisionist historian named Hasegawa Michiko, who was born in 1945, “confrontation with Japan was inevitable. We did not fight for Japan alone. Our aim was to fight a Greater East Asia War. For this reason the war between Japan and China and Japan’s oppression of…
  • West German textbooks describe the Nuremberg trial in far more detail. And they make a clear distinction between the retroactive law on crimes against peace and the…
  • Nationalist revisionists talk about “the Tokyo Trial View of History,” as though the conclusions of the tribunal had been nothing but rabid anti-Japanese propaganda. The tribunal has been called a lynch mob, and Japanese leftists are blamed for undermining the morale of generations of Japanese by passing on the Tokyo Trial View of History in school textbooks and liberal publications. The Tokyo Trial…
  • When Hellmut Becker said that few Germans wished to criticize the procedures of the Nuremberg trial because the criminality of the defendants was so plain to see, he was talking about crimes against humanity—more precisely, about the Holocaust. And it was…
  • The knowledge compiled by the doctors of Unit 731—of freezing experiments, injection of deadly diseases, vivisections, among other things—was considered so valuable by the Americans in 1945 that the doctors…
  • those aspects of the war that were most revolting and furthest removed from actual combat, such as the medical experiments on human guinea pigs (known as “logs”) carried out by Unit 731 in…
  • There never were any Japanese war crimes trials, nor is there a Japanese Ludwigsburg. This is partly because there was no exact equivalent of the Holocaust. Even though the behavior of Japanese troops was often barbarous, and the psychological consequences of State Shinto and emperor worship were frequently as hysterical as Nazism, Japanese atrocities were part of a…
  • This difference between (West) German and Japanese textbooks is not just a matter of detail; it shows a gap in perception. To the Japanese, crimes against humanity are not associated with an equivalent to the…
  • on what grounds would Japanese courts have prosecuted their own former leaders? Hata’s answer: “For starting a war which they knew they would lose.” Hata used the example of General Galtieri and his colleagues in Argentina after losing the Falklands War. In short, they would have been tried for losing the war, and the intense suffering they inflicted on their own people. This is as though German courts in 1918 had put General Hindenburg or General Ludendorff on trial.
  • it shows yet again the fundamental difference between the Japanese war, in memory and, I should say, in fact, and the German experience. The Germans fought a war too, but the one for which they tried their own people, the Bogers and the Schwammbergers, was a war they could not lose, unless defeat meant that some of the enemies survived.
  • Just as German leftists did in the case of Nuremberg, Kobayashi used the trial to turn the tables against the judges. But not necessarily to mitigate Japanese guilt. Rather, it was his intention to show how the victors had betrayed the pacifism they themselves had imposed on Japan.
  • the Japanese left has a different view of the Tokyo trial than the revisionist right. It is comparable to the way the German left looks upon Nuremberg. This was perfectly, if somewhat long-windedly, expressed in Kobayashi Masaki’s documentary film Tokyo Trial, released in 1983. Kobayashi is anything but an apologist for the Japanese war. His most famous film, The Human Condition, released in 1959, took a highly critical view of the war.
  • Yoshimoto’s memory was both fair and devastating, for it pointed straight at the reason for the trial’s failure. The rigging of a political trial—the “absurd ritual”—undermined the value of that European idea of law.
  • Yoshimoto went on to say something no revisionist would ever mention: “I also remember my fresh sense of wonder at this first encounter with the European idea of law, which was so different from the summary justice in our Asiatic courts. Instead of getting your head chopped off without a proper trial, the accused were able to defend themselves, and the careful judgment appeared to follow a public procedure.”
  • Yoshimoto Takaaki, philosopher of the 1960s New Left. Yet he wrote in 1986 that “from our point of view as contemporaries and witnesses, the trial was partly plotted from the very start. It was an absurd ritual before slaughtering the sacrificial lamb.”
  • This, from all accounts, was the way it looked to most Japanese, even if they had little sympathy for most of the “lambs.” In 1948, after three years of American occupation censorship and boosterism, people listened to the radio broadcast of the verdicts with a sad but fatalist shrug: this is what you can expect when you lose the war.
  • Some of the information even surprised the defendants. General Itagaki Seishiro, a particularly ruthless figure, who was in command of prison camps in Southeast Asia and whose troops had massacred countless Chinese civilians, wrote in his diary: “I am learning of matters I had not known and recalling things I had forgotten.”
  • hindsight, one can only conclude that instead of helping the Japanese to understand and accept their past, the trial left them with an attitude of cynicism and resentment.
  • After it was over, the Nippon Times pointed out the flaws of the trial, but added that “the Japanese people must ponder over why it is that there has been such a discrepancy between what they thought and what the rest of the world accepted almost as common knowledge. This is at the root of the tragedy which Japan brought upon herself.”
  • Political trials produce politicized histories. This is what the revisionists mean when they talk about the Tokyo Trial View of History. And they are right, even if their own conclusions are not.
  • Frederick Mignone, one of the prosecutors, said a trifle histrionically that “in Japan and in the Orient in general, the trial is one of the most important phases of the occupation. It has received wide coverage in the Japanese press and revealed for the first time to millions of Japanese the scheming, duplicity, and insatiable desire for power of her entrenched militaristic leaders, writing a much-needed history of events which otherwise would not have been written.” It was indeed much-needed, since so little was known.
  • The president of the Tokyo tribunal, Sir William Webb, thought “the crimes of the German accused were far more heinous, varied and extensive than those of the Japanese accused.” Put in another way, nearly all the defendants at Nuremberg, convicted of crimes against peace, were also found guilty of crimes against humanity. But half the Japanese defendants received life sentences for political crimes only.
  • the question of responsibility is always a tricky affair in Japan, where formal responsibility is easier to identify than actual guilt. Not only were there many men, such as the hero of Kinoshita’s play, who took the blame for what their superiors had done—a common practice in Japan, in criminal gangs as well as in politics or business corporations—but the men at the top were often not at all in control of their unscrupulous subordinates.
  • “These men were not the hoodlums who were the powerful part of the group which stood before the tribunal at Nuremberg, dregs of a criminal environment, thoroughly schooled in the ways of crime and knowing no other methods but those of crime. These men were supposed to be the elite of the nation, the honest and trusted leaders to whom the fate of the nation had been confidently entrusted
  • many people were wrongly accused of the wrong things for the wrong reasons. This is why there was such sympathy in Japan for the men branded by foreigners as war criminals, particularly the so-called Class B and Class C criminals, the men who followed orders, or gave them at a lower level: field commanders, camp guards, and so on.
  • “The Japanese people are of the opinion that the actual goal of the war crimes tribunals was never realized, since the judgments were reached by the victors alone and had the character of revenge. The [Japanese] war criminal is not conscious of having committed a crime, for he regards his deeds as acts of war, committed out of patriotism.”
  • Yamashita Tomoyuki. Terrible atrocities were committed under his command in the Philippines. The sacking of Manila in 1945 was about as brutal as the Nanking Massacre. So to depict him in the movie as a peaceful gentleman, while portraying the American prosecutor in Manila as one of the main villains, might seem an odd way to view the past.
  • The Shrine ranks highest. It is the supreme symbol of authority, shouldered (like a shrine on festival days) by the Officials.
  • The political theorist Maruyama Masao called the prewar Japanese government a “system of irresponsibilities.” He identified three types of political personalities: the portable Shrine, the Official, and the Outlaw.
  • those who carry it, the Officials, are the ones with actual power. But the Officials—bureaucrats, politicians, admirals and generals—are often manipulated by the lowest-ranking Outlaws, the military mavericks, the hotheaded officers in the field, the mad nationalists, and other agents of violence.
  • But it was not entirely wrong, for the trial was rigged. Yamashita had no doubt been a tough soldier, but in this case he had been so far removed from the troops who ran amok in Manila that he could hardly have known what was going on. Yet the American prosecutor openly talked about his desire to hang “Japs.”
  • When the system spins out of control, as it did during the 1930s, events are forced by violent Outlaws, reacted to by nervous Officials, and justified by the sacred status of the Shrines.
  • Here we come to the nub of the problem, which the Tokyo trial refused to deal with, the role of the Shrine in whose name every single war crime was committed, Emperor Hirohito,
  • The historian Ienaga Saburo tells a story about a Japanese schoolchild in the 1930s who was squeamish about having to dissect a live frog. The teacher rapped him hard on the head with his knuckles and said: “Why are you crying about one lousy frog? When you grow up you’ll have to kill a hundred, two hundred Chinks.”
  • the lethal consequences of the emperor-worshipping system of irresponsibilities did emerge during the Tokyo trial. The savagery of Japanese troops was legitimized, if not driven, by an ideology that did not include a Final Solution but was as racialist as Hitler’s National Socialism. The Japanese were the Asian Herrenvolk, descended from the gods.
  • A veteran of the war in China said in a television interview that he was able to kill Chinese without qualms only because he didn’t regard them as human.
  • For to keep the emperor in place (he could at least have been made to resign), Hirohito’s past had to be freed from any blemish; the symbol had to be, so to speak, cleansed from what had been done in its name.
  • The same was true of the Japanese imperial institution, no matter who sat on the throne, a ruthless war criminal or a gentle marine biologist.
  • the chaplain at Sugamo prison, questioned Japanese camp commandants about their reasons for mistreating POWs. This is how he summed up their answers: “They had a belief that any enemy of the emperor could not be right, so the more brutally they treated their prisoners, the more loyal to their emperor they were being.”
  • The Mitscherlichs described Hitler as “an object on which Germans depended, to which they transferred responsibility, and he was thus an internal object. As such, he represented and revived the ideas of omnipotence that we all cherish about ourselves from infancy.
  • The fear after 1945 was that without the emperor Japan would be impossible to govern. In fact, MacArthur behaved like a traditional Japanese strongman (and was admired for doing so by many Japanese), using the imperial symbol to enhance his own power. As a result, he hurt the chances of a working Japanese democracy and seriously distorted history.
  • Aristides George Lazarus, the defense counsel of one of the generals on trial, was asked to arrange that “the military defendants, and their witnesses, would go out of their way during their testimony to include the fact that Hirohito was only a benign presence when military actions or programs were discussed at meetings that, by protocol, he had to attend.” No doubt the other counsel were given similar instructions. Only once during the trial
Javier E

Ozempic or Bust - The Atlantic - 0 views

  • it is impossible to know, in the first few years of any novel intervention, whether its success will last.
  • ...77 more annotations...
  • The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started
  • Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.
  • The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic.
  • Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”
  • Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)
  • In the United States, an estimated 189 million adults are classified as having obesity or being overweight
  • The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight
  • For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.
  • The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or cholesterol. By those criteria, more than 140 million American adults already qualify
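A minimal sketch of how that eligibility rule reads in practice (the function names and the BMI-of-30 obesity cutoff are my assumptions; the article itself states only the over-27-plus-condition criterion):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def qualifies_for_glp1(weight_kg: float, height_m: float,
                       has_weight_related_condition: bool) -> bool:
    # Hypothetical encoding of the criteria described above: obesity
    # (BMI of 30 or more, the standard clinical cutoff), or a BMI over 27
    # together with an associated condition such as high blood pressure
    # or high cholesterol.
    b = bmi(weight_kg, height_m)
    return b >= 30 or (b > 27 and has_weight_related_condition)

# Example: 88 kg at 1.75 m is a BMI of about 28.7, so this person
# qualifies only if a weight-related condition is also present.
print(qualifies_for_glp1(88, 1.75, has_weight_related_condition=True))  # True
```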
  • if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”
  • How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk
  • The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years
  • the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not borne out.
  • adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time?
  • “All of us, in the back of our minds, always wonder, Will something show up?  ” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”
  • in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.
  • The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories.
  • But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.
  • Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses.”
  • Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might produce, in the very best scenario, an average lasting weight loss of just 5 to 10 percent.
  • National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.
  • Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm.
  • In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,”
  • In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million.
  • It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life.
  • the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”
  • News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque.
  • Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.”
  • “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”
  • She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”
  • For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.”
  • In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.)
  • In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.
  • The message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water—plain old water—as something “cool” to drink.
  • Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could.
  • Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country.
  • An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.
  • when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.
  • The social engineering worked. Slowly but surely, Americans’ lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out.
  • although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.
  • sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item.”
  • That tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.
  • Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.
  • From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.
  • obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”
  • she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions.
  • Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.
  • if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it.
  • She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit.
  • Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.
  • “Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.
  • Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.
  • As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid.
  • Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon, who had struggled to overcome an eating disorder, had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look.
  • By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners.
  • But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.
  • The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. By 2015, that rate had risen to three-quarters.
  • In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”
  • that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity.
  • That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism
  • But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”
  • Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events.
  • As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell.
  • Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed.
  • In the meantime, thinness was coming back into fashion.
  • In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.
  • Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight.
  • If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat.
  • Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.
  • Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”
  • When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.
  • the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now.
  • “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”
  • Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies.
  • The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways?
  • When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.”
  • why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.
Javier E

Uber Eats Is Killing the Sociable Restaurant Experience - The Atlantic - 0 views

  • Like so many in 18th-century Europe, the self-styled inventor of restaurants, Mathurin Roze de Chantoiseau, believed that improving circulation—of food through people’s bodies or of money, goods, and information through society—would bring benefits to all.
  • restaurants also promised new levels of personal service. Separate tables, flexible mealtimes, and menus distinguished dinner in a restaurant from the more collective, communitarian experience of an innkeeper’s or caterer’s table d’hôte.
  • By promising to restore circulation and facilitate personal well-being, restaurants offered both pleasure and profit: profit for the restaurateur, to be sure, but also for the individual customer and, by extension, for the public at large. Public benefits could come from private appetites.
  • ...5 more annotations...
  • In his essay “On Refinement in the Arts,” the philosopher and historian David Hume traced a similar logic, positing that improvements in production (what we today call the Industrial Revolution) and ideas (the Enlightenment) would necessarily spur greater sociability. What was good for one was good for all. “The more these refined arts advance,” Hume wrote, “the more sociable men become … enriched with science, and possessed of a fund of conversation … both sexes meet in an easy and sociable manner; and the tempers of men, as well as their behaviour, refine apace … Thus industry, knowledge, and humanity are linked together by an indissoluble chain.”
  • If brick-and-mortar restaurants become mere storefronts for delivery services, they will cease to be public spaces in any sense of the term. When dinner from a restaurant replaces dinner in a restaurant, we lose track of all the other people who are dining as well.
  • this hyper-individualization of consumption may bring a new political revolution as well.
  • The way we shop and eat now forms a feedback loop with the general discrediting of the idea of “public good”—and, with it, of public spaces and shared civility.
  • Cell phones, charter schools, the rhetoric of “taxpayer” dollars (as if the money, once paid, still belonged to those who paid it): all make for a political climate and lived reality where very little that is “public,” in the sense of shared and common, remains.
aidenborst

Opinion: To recharge the country, Biden should give 1 million college students paid gov... - 0 views

  • There are 16 million college students in America. These young people hold the promise of being our next generation of innovators and leaders. But once they graduate, they face student debt repayments and a labor market that has been ravaged by the pandemic.
  • Over the next four years, our country can give 1 million college students a paid internship in federal, state or municipal government, plus a chance to compete for a college scholarship in exchange for their service.
  • Pew polling shows trust in government at near-record lows, and interest in public service is also eroding. America's poor response to the pandemic and concerns about the abuse of executive power have driven many young prospects away.
  • ...9 more annotations...
  • Analysis of government data shows that just 6.7% of federal employees are under the age of 30 — compared with 20.6% of those in the employed labor force. That's even more problematic when you consider that about one-third of federal workers will be eligible to retire in the next five years.
  • We need talented young folks in government, not only to replace those coming to the end of their careers of service, but also to bring new skills that will help the country confront a wide array of public challenges and opportunities.
  • The administration should greatly expand its efforts to encourage talented college students to apply for paid internships at government entities.
  • The nonprofit Partnership for Public Service would provide initial staff and infrastructure to help facilitate the recruitment, selection and matching operations for the federal component to ensure that participants are placed in internships.
  • For the duration of the administration, the government could arrange about 20,000 internships at a time — for example, 400 dedicated to federal, state, or city government in each state if distributed evenly.
  • Second, the program could be expanded through competition.
  • Third, given their prior experience, participants in the internship program could have fast-tracked access to compelling projects upon completion of their college degrees.
  • The program could include a mentorship component, pairing experienced leaders with prospective new talent, aimed at guiding participants in their work and encouraging and helping them pursue a career in public service.
  • Challenges like protecting the population against pandemics and unwinding centuries of racial injustice will not be solved in the span of a single presidency, or even within a single generation. To tackle these and other challenges, the American government — and its ability to attract innovative talent — has to be built for the long haul. And we have to start building it now.