These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since.
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • Understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political . . .
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • Studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed . . .
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • In 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and a new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, Bancroft hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • In 1751, Franklin wrote an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • He wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies.
  • It closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • As early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it ensures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish . . .
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change.
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country.
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • The geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Though hardly ever reported in the press, more than one hundred incidents of violence broke out between congressmen in the years between 1830 and 1860, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • The Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1856, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description: Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____; as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 (Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.)
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government, and, as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations.
  • But in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected the courts to strike it down as unconstitutional, and partly because they wanted to provide benefits to their members themselves, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.”
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, a doctrine that called for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department, the Committee on Public Information, headed by George Creel.
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 dead, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • In 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

How Public Health Took Part in Its Own Downfall - The Atlantic - 0 views

  • when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.
  • By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health
  • By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals
  • ...27 more annotations...
  • In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice. By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.
  • Germ theory offered a seductive new vision for defeating disease: The old public health “sought the sources of infectious disease in the surroundings of man; the new finds them in man himself,” wrote Hibbert Hill in The New Public Health in 1913.
  • “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.”
  • As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,”
  • Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals,
  • Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research
  • Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education.
  • Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects.
  • That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,”
  • After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism.
  • Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.
  • Even public health began to treat people as if they lived in a social vacuum. Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.
  • Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,
  • “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”
  • In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.”
  • a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.
  • The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.”
  • Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May
  • the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th century: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons.
  • This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products.
  • “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,”
  • In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values.
  • Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”
  • “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”
  • The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department.
  • What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal? In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.
  • “We need to re-create alliances with others and help them to understand that what they are doing is public health,
clairemann

AOC and Rashida Tlaib's Public Banking Act, explained - Vox - 0 views

  • A public option, but for banking. That’s what Reps. Rashida Tlaib and Alexandria Ocasio-Cortez are proposing in a new bill unveiled on Friday.
  • would foster the creation of public banks across the country by providing them a pathway to getting started, establishing an infrastructure for liquidity and credit facilities for them via the Federal Reserve, and setting up federal guidelines for them to be regulated.
  • at some point it’s just hitting a wall where it doesn’t carry them along and they’re looking for options,” said Tlaib, who represents Michigan’s 13th Congressional District, the third-poorest congressional district in the country. “So I’m putting this on the table as an option.”
  • ...28 more annotations...
  • The proposal lands in the midst of the Covid-19 pandemic, which has shed light on many inefficiencies in the American system, including banking. Take the Paycheck Protection Program, for example: It used the regular banking system as an intermediary, which ultimately meant that bigger businesses and those with preexisting relationships with those banks were prioritized over others.
  • guarantee a more equitable recovery by providing an alternative to Wall Street banks for state and local governments, businesses, and ordinary people,
  • The public banking bill also does double duty as a climate bill: It would prohibit public banks from investing in or doing business with the fossil fuel industry.
  • “Public banks empower states and municipalities to establish new channels of public investment to help solve systemic crises.”
  • But, he said, this proposal is particularly comprehensive and supportive.
  • If Democrats keep control of the House come 2021 and manage to flip the Senate and win the White House, they’ll be able to take some big legislative swings, including and perhaps especially on issues related to the economy.
  • which theoretically would be more motivated to do public good and invest in their communities than private institutions, which are out for profit.
  • To be clear, the Public Banking Act isn’t creating a federal public bank.
  • encourage and enable the creation of public banks across the US. It provides legitimacy to those who are pushing for more public banking, and it also includes regulators as key stakeholders who can support and provide guidance for how those banks should operate.
  • though different public banks would likely have different areas of emphasis.
  • They could also facilitate easier access to funds for state and local governments from the federal government or Federal Reserve.
  • “It’s basically a way to finance state and local investment that doesn’t go through Wall Street and doesn’t leave the community and turn into a windfall for shareholders,
  • Public banks need the FDIC to provide assurances that it will recognize them in accordance with the bond rating of the city or state they represent.
  • Tlaib recalled hearing from her constituents when the $1,200 coronavirus stimulus checks went out this spring — people waiting days and weeks for direct deposits, or getting a check in the mail only to lose a substantial portion of it cashing it at the store down the street.
  • The Public Banking Act allows the Federal Reserve to charter and grant membership to public banks and creates a grant program for the Treasury secretary to provide seed money for public banks to be formed, capitalized, and developed.
  • “This is more about community development.”
  • McConnell said that if the FDIC issued guidance recognizing the city’s and the state’s public banks as carrying an AAA rating, it would send a clear signal to state financial regulators that the public bank is considered low risk.
  • The bill would also provide a road map for the FDIC, which insures bank deposits of up to $250,000, to insure deposits for public banks, so people feel assured they won’t lose all their money by choosing to open an account with their state bank instead of, say, Wells Fargo.
  • the Office of the Comptroller of the Currency (OCC) has historically been charged with chartering national banks in the US, not the Fed, meaning this is a fairly novel idea.
  • It prohibits the Fed and Treasury from considering the financial health of an entity that controls or owns a bank in grant-making decisions.
  • So here is the thing about private companies, including, yes, banks: The point of them is to make money, and that drives their decisions. It’s not necessarily evil (though sometimes it kind of is), but it’s just how they work.
  • The idea behind public banking isn’t that Goldman Sachs, Wells Fargo, and Morgan Stanley go away; it’s that they have to compete with a government-owned entity — and one that’s a little fairer and more ethical in how it does business.
  • Public banks, as imagined in the Tlaib/Ocasio-Cortez proposal, would provide loans to small businesses and governments with lower interest rates and lower fees.
  • Student loans are facilitated directly with the Bank of North Dakota (BND), but other loans, called participation loans, go through a local financial institution — often with BND support.
  • According to a study on public banks, BND had some $2 billion in active participation loans in 2014. BND can grant larger loans at a lower risk, which fosters a healthy financial ecosystem populated by a cluster of small North Dakota banks.
  • Democrats have a lot of ideas, and if they take power come January 2021, there’s a lot they can do.
  • The Public Banking Act is meant to complement ideas such as the ABC Act and postal banking. And, of course, it’s linked to the Green New Deal, not only because it would bar public banks from financing things that hurt the environment, but also because the idea is that public banks would play a major role in financing Green New Deal and climate-friendly projects.
  • If former Vice President Joe Biden wins the White House and Democrats control both the House and the Senate come 2021, the talk around these ideas becomes a lot more serious.
Javier E

The nation's public health agencies are ailing when they're needed most - The Washington Post - 0 views

  • At the very moment the United States needed its public health infrastructure the most, many local health departments had all but crumbled, proving ill-equipped to carry out basic functions let alone serve as the last line of defense against the most acute threat to the nation’s health in generations.
  • Epidemiologists, academics and local health officials across the country say the nation’s public health system is one of many weaknesses that continue to leave the United States poorly prepared to handle the coronavirus pandemic
  • That system lacks financial resources. It is losing staff by the day.
  • ...31 more annotations...
  • Even before the pandemic struck, local public health agencies had lost almost a quarter of their overall workforce since 2008 — a reduction of almost 60,000 workers
  • The agencies’ main source of federal funding — the Centers for Disease Control and Prevention’s emergency preparedness budget — had been cut 30 percent since 2003. The Trump administration had proposed slicing even deeper.
  • According to David Himmelstein of the CUNY School of Public Health, global consensus is that, at minimum, 6 percent of a nation’s health spending should be devoted to public health efforts. The United States, he said, has never spent more than half that much.
  • the problems have been left to fester.
  • Delaware County, Pa., a heavily populated Philadelphia suburb, did not even have a public health department when the pandemic struck and had to rely on a neighbor to mount a response.
  • With plunging tax receipts straining local government budgets, public health agencies confront the possibility of further cuts in an economy gutted by the coronavirus. It is happening at a time when health departments are being asked to do more than ever.
  • While the country spends roughly $3.6 trillion every year on health, less than 3 percent of that spending goes to public health and prevention
  • “Why an ongoing government function should depend on episodic grants rather than consistent funding, I don’t know,” he added. “That would be like seeing that the military is going to apply for a grant for its regular ongoing activities.”
  • Compared with Canada, the United Kingdom and northern European countries, the United States — with a less generous social safety net and no universal health care — is investing less in a system that its people rely on more.
  • Himmelstein said that the United States has never placed much emphasis on public health spending but that the investment began to decline even further in the early 2000s. The Great Recession fueled further cuts.
  • Plus, the U.S. public health system relies heavily on federal grants.
  • “That’s the way we run much of our public health activity for local health departments. You apply to the CDC, which is the major conduit for federal funding to state and local health departments,” Himmelstein said. “You apply to them for funding for particular functions, and if you don’t get the grant, you don’t have the funding for that.”
  • Many public health officials say a lack of a national message and approach to the pandemic has undermined their credibility and opened them up to criticism.
  • Few places were less prepared for covid-19’s arrival than Delaware County, Pa., where Republican leaders had decided they did not need a public health department at all
  • At the same time, many countries that invest more in public health infrastructure also provide universal medical coverage that enables them to provide many common public health services as part of their main health-care-delivery system.
  • Taylor and other elected officials worked out a deal with neighboring Chester County in which Delaware County paid affluent Chester County’s health department to handle coronavirus operations for both counties for now.
  • One reason health departments are so often neglected is their work focuses on prevention — of outbreaks, sexually transmitted diseases, smoking-related illnesses. Local health departments describe a frustrating cycle: The more successful they are, the less visible problems are and the less funding they receive. Often, that sets the stage for problems to explode again — as infectious diseases often do.
  • It has taken years for many agencies to rebuild budgets and staffing from deep cuts made during the last recession.
  • During the past decade, many local health departments have seen annual rounds of cuts, punctuated with one-time infusions of money following crises such as outbreaks of Zika, Ebola, measles and hepatitis. The problem with that cycle of feast or famine funding is that the short-term money quickly dries up and does nothing to address long-term preparedness.
  • “It’s a silly strategic approach when you think about what’s needed to protect us long term,”
  • She compared the country’s public health system to a house with deep cracks in the foundation. The emergency surges of funding are superficial repairs that leave those cracks unaddressed.
  • “We came into this pandemic at a severe deficit and are still without a strategic goal to build back that infrastructure. We need to learn from our mistakes,”
  • With the economy tanking, the tax bases for cities and counties have shrunk dramatically — payroll taxes, sales taxes, city taxes. Many departments have started cutting staff. Federal grants are no sure thing.
  • 80 percent of counties have reported their budget was affected in the current fiscal year because of the crisis. Prospects are even more dire for future budget periods, when the full impact of reduced tax revenue will become evident.
  • Christine Hahn, medical director for Idaho’s division of public health and a 25-year public health veteran, has seen the state make progress in coronavirus testing and awareness. But like so many public health officials across the country taking local steps to deal with what has become a national problem, she is limited by how much government leaders say she can do and by what citizens are willing to do.
  • “I’ve been through SARS, the 2009 pandemic, the anthrax attacks, and of course I’m in rural Idaho, not New York City and California,” Hahn said. “But I will say this is way beyond anything I’ve ever experienced as far as stress, workload, complexity, frustration, media and public interest, individual citizens really feeling very strongly about what we’re doing and not doing.”
  • “I think the general population didn’t really realize we didn’t have a health department. They just kind of assumed that was one of those government agencies we had,” Taylor said. “Then the pandemic hit, and everyone was like, ‘Wait, hold on — we don’t have a health department? Why don’t we have a health department?’ ”
  • “People locally are looking to see what’s happening in other states, and we’re constantly having to talk about that and address that,”
  • “I’m mindful of the credibility of our messaging as people say, ‘What about what they’re doing in this place? Why are we not doing what they’re doing?’ ”
  • Many health experts worry the challenges will multiply in the fall with the arrival of flu season.
  • “The unfolding tragedy here is we need people to see local public health officials as heroes in the same way that we laud heart surgeons and emergency room doctors,” Westergaard, the Wisconsin epidemiologist, said. “The work keeps getting higher, and they’re falling behind — and not feeling appreciated by their communities.”
Javier E

Losing Earth: The Decade We Almost Stopped Climate Change - The New York Times - 0 views

  • As Malcolm Forbes Baldwin, the acting chairman of the president’s Council for Environmental Quality, told industry executives in 1981, “There can be no more important or conservative concern than the protection of the globe itself.”
  • Among those who called for urgent, immediate and far-reaching climate policy were Senators John Chafee, Robert Stafford and David Durenberger; the E.P.A. administrator, William K. Reilly; and, during his campaign for president, George H.W. Bush.
  • It was understood that action would have to come immediately. At the start of the 1980s, scientists within the federal government predicted that conclusive evidence of warming would appear on the global temperature record by the end of the decade, at which point it would be too late to avoid disaster.
  • ...180 more annotations...
  • If the world had adopted the proposal widely endorsed at the end of the ’80s — a freezing of carbon emissions, with a reduction of 20 percent by 2005 — warming could have been held to less than 1.5 degrees.
  • Action had to be taken, and the United States would need to lead. It didn’t.
  • There can be no understanding of our current and future predicament without understanding why we failed to solve this problem when we had the chance.
  • The first suggestion to Rafe Pomerance that humankind was destroying the conditions necessary for its own survival came on Page 66 of the government publication EPA-600/7-78-019. It was a technical report about coal
  • ‘This Is the Whole Banana’ Spring 1979
  • here was an urgent problem that demanded their attention, MacDonald believed, because human civilization faced an existential crisis. In “How to Wreck the Environment,” a 1968 essay published while he was a science adviser to Lyndon Johnson, MacDonald predicted a near future in which “nuclear weapons were effectively banned and the weapons of mass destruction were those of environmental catastrophe.” One of the most potentially devastating such weapons, he believed, was the gas that we exhaled with every breath: carbon dioxide. By vastly increasing carbon emissions, the world’s most advanced militaries could alter weather patterns and wreak famine, drought and economic collapse.
  • the Jasons. They were like one of those teams of superheroes with complementary powers that join forces in times of galactic crisis. They had been brought together by federal agencies, including the C.I.A, to devise scientific solutions to national-security problems: how to detect an incoming missile; how to predict fallout from a nuclear bomb; how to develop unconventional weapons, like plague-infested rats.
  • Agle pointed to an article about a prominent geophysicist named Gordon MacDonald, who was conducting a study on climate change with the Jasons, the mysterious coterie of elite scientists to which he belonged
  • During the spring of 1977 and the summer of 1978, the Jasons met to determine what would happen once the concentration of carbon dioxide in the atmosphere doubled from pre-Industrial Revolution levels. It was an arbitrary milestone, the doubling, but a useful one, as its inevitability was not in question; the threshold would most likely be breached by 2035.
  • The Jasons’ report to the Department of Energy, “The Long-Term Impact of Atmospheric Carbon Dioxide on Climate,” was written in an understated tone that only enhanced its nightmarish findings: Global temperatures would increase by an average of two to three degrees Celsius; Dust Bowl conditions would “threaten large areas of North America, Asia and Africa”; access to drinking water and agricultural production would fall, triggering mass migration on an unprecedented scale. “Perhaps the most ominous feature,” however, was the effect of a changing climate on the poles. Even a minimal warming “could lead to rapid melting” of the West Antarctic ice sheet. The ice sheet contained enough water to raise the level of the oceans 16 feet.
  • MacDonald explained that he first studied the carbon-dioxide issue when he was about Pomerance’s age — in 1961, when he served as an adviser to John F. Kennedy. Pomerance pieced together that MacDonald, in his youth, had been something of a prodigy: In his 20s, he advised Dwight D. Eisenhower on space exploration; at 32, he became a member of the National Academy of Sciences; at 40, he was appointed to the inaugural Council on Environmental Quality, where he advised Richard Nixon on the environmental dangers of burning coal. He monitored the carbon-dioxide problem the whole time, with increasing alarm.
  • They were surprised to learn how few senior officials were familiar with the Jasons’ findings, let alone understood the ramifications of global warming. At last, having worked their way up the federal hierarchy, the two went to see the president’s top scientist, Frank Press.
  • Thus began the Gordon and Rafe carbon-dioxide roadshow. Beginning in the spring of 1979, Pomerance arranged informal briefings with the E.P.A., the National Security Council, The New York Times, the Council on Environmental Quality and the Energy Department, which, Pomerance learned, had established an Office of Carbon Dioxide Effects two years earlier at MacDonald’s urging
  • Out of respect for MacDonald, Press had summoned to their meeting what seemed to be the entire senior staff of the president’s Office of Science and Technology Policy — the officials consulted on every critical matter of energy and national security. What Pomerance had expected to be yet another casual briefing assumed the character of a high-level national-security meeting.
  • MacDonald would begin his presentation by going back more than a century to John Tyndall — an Irish physicist who was an early champion of Charles Darwin’s work and died after being accidentally poisoned by his wife. In 1859, Tyndall found that carbon dioxide absorbed heat and that variations in the composition of the atmosphere could create changes in climate. These findings inspired Svante Arrhenius, a Swedish chemist and future Nobel laureate, to deduce in 1896 that the combustion of coal and petroleum could raise global temperatures. This warming would become noticeable in a few centuries, Arrhenius calculated, or sooner if consumption of fossil fuels continued to increase.
  • Four decades later, a British steam engineer named Guy Stewart Callendar discovered that, at the weather stations he observed, the previous five years were the hottest in recorded history. Humankind, he wrote in a paper, had become “able to speed up the processes of Nature.” That was in 1939.
  • MacDonald’s history concluded with Roger Revelle, perhaps the most distinguished of the priestly caste of government scientists who, since the Manhattan Project, advised every president on major policy; he had been a close colleague of MacDonald and Press since they served together under Kennedy. In a 1957 paper written with Hans Suess, Revelle concluded that “human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.” Revelle helped the Weather Bureau establish a continuous measurement of atmospheric carbon dioxide at a site perched near the summit of Mauna Loa on the Big Island of Hawaii, 11,500 feet above the sea — a rare pristine natural laboratory on a planet blanketed by fossil-fuel emissions.
  • After nearly a decade of observation, Revelle had shared his concerns with Lyndon Johnson, who included them in a special message to Congress two weeks after his inauguration. Johnson explained that his generation had “altered the composition of the atmosphere on a global scale” through the burning of fossil fuels, and his administration commissioned a study of the subject by his Science Advisory Committee. Revelle was its chairman, and its 1965 executive report on carbon dioxide warned of the rapid melting of Antarctica, rising seas, increased acidity of fresh waters — changes that would require no less than a coordinated global effort to forestall. Yet emissions continued to rise, and at this rate, MacDonald warned, they could see a snowless New England, the swamping of major coastal cities, as much as a 40 percent decline in national wheat production, the forced migration of about one-quarter of the world’s population. Not within centuries — within their own lifetimes.
  • On May 22, Press wrote a letter to the president of the National Academy of Sciences requesting a full assessment of the carbon-dioxide issue. Jule Charney, the father of modern meteorology, would gather the nation’s top oceanographers, atmospheric scientists and climate modelers to judge whether MacDonald’s alarm was justified — whether the world was, in fact, headed to cataclysm.
  • If Charney’s group confirmed that the world was careering toward an existential crisis, the president would be forced to act.
  • Hansen turned from the moon to Venus. Why, he tried to determine, was its surface so hot? In 1967, a Soviet satellite beamed back the answer: The planet’s atmosphere was mainly carbon dioxide. Though once it may have had habitable temperatures, it was believed to have succumbed to a runaway greenhouse effect: As the sun grew brighter, Venus’s ocean began to evaporate, thickening the atmosphere, which forced yet greater evaporation — a self-perpetuating cycle that finally boiled off the ocean entirely and heated the planet’s surface to more than 800 degrees Fahrenheit
  • At the other extreme, Mars’s thin atmosphere had insufficient carbon dioxide to trap much heat at all, leaving it about 900 degrees colder. Earth lay in the middle, its Goldilocks greenhouse effect just strong enough to support life.
  • We want to learn more about Earth’s climate, Jim told Anniek — and how humanity can influence it. He would use giant new supercomputers to map the planet’s atmosphere. They would create Mirror Worlds: parallel realities that mimicked our own. These digital simulacra, technically called “general circulation models,” combined the mathematical formulas that governed the behavior of the sea, land and sky into a single computer model. Unlike the real world, they could be sped forward to reveal the future.
  • The government officials, many of them scientists themselves, tried to suppress their awe of the legends in their presence: Henry Stommel, the world’s leading oceanographer; his protégé, Carl Wunsch, a Jason; the Manhattan Project alumnus Cecil Leith; the Harvard planetary physicist Richard Goody. These were the men who, in the last three decades, had discovered foundational principles underlying the relationships among sun, atmosphere, land and ocean — which is to say, the climate.
  • When, at Charney’s request, Hansen programmed his model to consider a future of doubled carbon dioxide, it predicted a temperature increase of four degrees Celsius. That was twice as much warming as the prediction made by the most prominent climate modeler, Syukuro Manabe, whose government lab at Princeton was the first to model the greenhouse effect. The difference between the two predictions — between warming of two degrees Celsius and four degrees Celsius — was the difference between damaged coral reefs and no reefs whatsoever, between thinning forests and forests enveloped by desert, between catastrophe and chaos.
  • The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
  • within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius
  • The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
  • After the publication of the Charney report, Exxon decided to create its own dedicated carbon-dioxide research program, with an annual budget of $600,000. Only Exxon was asking a slightly different question than Jule Charney. Exxon didn’t concern itself primarily with how much the world would warm. It wanted to know how much of the warming Exxon could be blamed for.
  • “It behooves us to start a very aggressive defensive program,” Shaw wrote in a memo to a manager, “because there is a good probability that legislation affecting our business will be passed.”
  • Shaw turned to Wallace Broecker, a Columbia University oceanographer who was the second author of Roger Revelle’s 1965 carbon-dioxide report for Lyndon Johnson. In 1977, in a presentation at the American Geophysical Union, Broecker predicted that fossil fuels would have to be restricted, whether by taxation or fiat. More recently, he had testified before Congress, calling carbon dioxide “the No.1 long-term environmental problem.” If presidents and senators trusted Broecker to tell them the bad news, he was good enough for Exxon.
  • The company had been studying the carbon-dioxide problem for decades, since before it changed its name to Exxon. In 1957, scientists from Humble Oil published a study tracking “the enormous quantity of carbon dioxide” contributed to the atmosphere since the Industrial Revolution “from the combustion of fossil fuels.” Even then, the observation that burning fossil fuels had increased the concentration of carbon in the atmosphere was well understood and accepted by Humble’s scientists.
  • The American Petroleum Institute, the industry’s largest trade association, asked the same question in 1958 through its air-pollution study group and replicated the findings made by Humble Oil. So did another A.P.I. study conducted by the Stanford Research Institute a decade later, in 1968, which concluded that the burning of fossil fuels would bring “significant temperature changes” by the year 2000 and ultimately “serious worldwide environmental changes,” including the melting of the Antarctic ice cap and rising seas.
  • The ritual repeated itself every few years. Industry scientists, at the behest of their corporate bosses, reviewed the problem and found good reasons for alarm and better excuses to do nothing. Why should they act when almost nobody within the United States government — nor, for that matter, within the environmental movement — seemed worried?
  • Why take on an intractable problem that would not be detected until this generation of employees was safely retired? Worse, the solutions seemed more punitive than the problem itself. Historically, energy use had correlated to economic growth — the more fossil fuels we burned, the better our lives became. Why mess with that?
  • That June, Jimmy Carter signed the Energy Security Act of 1980, which directed the National Academy of Sciences to start a multiyear, comprehensive study, to be called “Changing Climate,” that would analyze social and economic effects of climate change. More urgent, the National Commission on Air Quality, at the request of Congress, invited two dozen experts, including Henry Shaw himself, to a meeting in Florida to propose climate policy.
  • On April 3, 1980, Senator Paul Tsongas, a Massachusetts Democrat, held the first congressional hearing on carbon-dioxide buildup in the atmosphere. Gordon MacDonald testified that the United States should “take the initiative” and develop, through the United Nations, a way to coordinate every nation’s energy policies to address the problem.
  • During the expansion of the Clean Air Act, he pushed for the creation of the National Commission on Air Quality, charged with ensuring that the goals of the act were being met. One such goal was a stable global climate. The Charney report had made clear that goal was not being met, and now the commission wanted to hear proposals for legislation. It was a profound responsibility, and the two dozen experts invited to the Pink Palace — policy gurus, deep thinkers, an industry scientist and an environmental activist — had only three days to achieve it, but the utopian setting made everything seem possible
  • We have less time than we realize, said an M.I.T. nuclear engineer named David Rose, who studied how civilizations responded to large technological crises. “People leave their problems until the 11th hour, the 59th minute,” he said. “And then: ‘Eloi, Eloi, Lama Sabachthani?’ ” — “My God, my God, why hast thou forsaken me?”
  • The attendees seemed to share a sincere interest in finding solutions. They agreed that some kind of international treaty would ultimately be needed to keep atmospheric carbon dioxide at a safe level. But nobody could agree on what that level was.
  • William Elliott, a NOAA scientist, introduced some hard facts: If the United States stopped burning carbon that year, it would delay the arrival of the doubling threshold by only five years. If Western nations somehow managed to stabilize emissions, it would forestall the inevitable by only eight years. The only way to avoid the worst was to stop burning coal. Yet China, the Soviet Union and the United States, by far the world’s three largest coal producers, were frantically accelerating extraction.
  • “Do we have a problem?” asked Anthony Scoville, a congressional science consultant. “We do, but it is not the atmospheric problem. It is the political problem.” He doubted that any scientific report, no matter how ominous its predictions, would persuade politicians to act.
  • The talk of ending oil production stirred for the first time the gentleman from Exxon. “I think there is a transition period,” Henry Shaw said. “We are not going to stop burning fossil fuels and start looking toward solar or nuclear fusion and so on. We are going to have a very orderly transition from fossil fuels to renewable energy sources.”
  • What if the problem was that they were thinking of it as a problem? “What I am saying,” Scoville continued, “is that in a sense we are making a transition not only in energy but the economy as a whole.” Even if the coal and oil industries collapsed, renewable technologies like solar energy would take their place. Jimmy Carter was planning to invest $80 billion in synthetic fuel. “My God,” Scoville said, “with $80 billion, you could have a photovoltaics industry going that would obviate the need for synfuels forever!”
  • nobody could agree what to do. John Perry, a meteorologist who had worked as a staff member on the Charney report, suggested that American energy policy merely “take into account” the risks of global warming, though he acknowledged that a nonbinding measure might seem “intolerably stodgy.” “It is so weak,” Pomerance said, the air seeping out of him, “as to not get us anywhere.”
  • Scoville pointed out that the United States was responsible for the largest share of global carbon emissions. But not for long. “If we’re going to exercise leadership,” he said, “the opportunity is now.”
  • One way to lead, he proposed, would be to classify carbon dioxide as a pollutant under the Clean Air Act and regulate it as such. This was received by the room like a belch. By Scoville’s logic, every sigh was an act of pollution. Did the science really support such an extreme measure? The Charney report did exactly that, Pomerance said.
  • Slade, the director of the Energy Department’s carbon-dioxide program, considered the lag a saving grace. If changes did not occur for a decade or more, he said, those in the room couldn’t be blamed for failing to prevent them. So what was the problem?
  • “Call it whatever.” Besides, Pomerance added, they didn’t have to ban coal tomorrow. A pair of modest steps could be taken immediately to show the world that the United States was serious: the implementation of a carbon tax and increased investment in renewable energy. Then the United States could organize an international summit meeting to address climate change
  • these two dozen experts, who agreed on the major points and had made a commitment to Congress, could not draft a single paragraph. Hours passed in a hell of fruitless negotiation, self-defeating proposals and impulsive speechifying. Pomerance and Scoville pushed to include a statement calling for the United States to “sharply accelerate international dialogue,” but they were sunk by objections and caveats.
  • They never got to policy proposals. They never got to the second paragraph. The final statement was signed by only the moderator, who phrased it more weakly than the declaration calling for the workshop in the first place. “The guide I would suggest,” Jorling wrote, “is whether we know enough not to recommend changes in existing policy.”
  • Pomerance had seen enough. A consensus-based strategy would not work — could not work — without American leadership. And the United States wouldn’t act unless a strong leader persuaded it to do so — someone who would speak with authority about the science, demand action from those in power and risk everything in pursuit of justice.
  • The meeting ended Friday morning. On Tuesday, four days later, Ronald Reagan was elected president.
  • ‘Otherwise, They’ll Gurgle’ November 1980-September 1981
  • In the midst of this carnage, the Council on Environmental Quality submitted a report to the White House warning that fossil fuels could “permanently and disastrously” alter Earth’s atmosphere, leading to “a warming of the Earth, possibly with very serious effects.” Reagan did not act on the council’s advice. Instead, his administration considered eliminating the council.
  • After the election, Reagan considered plans to close the Energy Department, increase coal production on federal land and deregulate surface coal mining. Once in office, he appointed James Watt, the president of a legal firm that fought to open public lands to mining and drilling, to run the Interior Department. “We’re deliriously happy,” the president of the National Coal Association was reported to have said. Reagan preserved the E.P.A. but named as its administrator Anne Gorsuch, an anti-regulation zealot who proceeded to cut the agency’s staff and budget by about a quarter
  • Reagan “has declared open war on solar energy,” the director of the nation’s lead solar-energy research agency said after he was asked to resign. Reagan appeared determined to reverse the environmental achievements of Jimmy Carter, before undoing those of Richard Nixon, Lyndon Johnson, John F. Kennedy and, if he could get away with it, Theodore Roosevelt.
  • When Reagan considered closing the Council on Environmental Quality, its acting chairman, Malcolm Forbes Baldwin, wrote to the vice president and the White House chief of staff begging them to reconsider; in a major speech the same week, “A Conservative’s Program for the Environment,” Baldwin argued that it was “time for today’s conservatives explicitly to embrace environmentalism.” Environmental protection was not only good sense. It was good business. What could be more conservative than an efficient use of resources that led to fewer federal subsidies?
  • Meanwhile the Charney report continued to vibrate at the periphery of public consciousness. Its conclusions were confirmed by major studies from the Aspen Institute, the International Institute for Applied Systems Analysis near Vienna and the American Association for the Advancement of Science. Every month or so, nationally syndicated articles appeared summoning apocalypse: “Another Warning on ‘Greenhouse Effect,’ ” “Global Warming Trend ‘Beyond Human Experience,’ ” “Warming Trend Could ‘Pit Nation Against Nation.’
  • Pomerance read on the front page of The New York Times on Aug. 22, 1981, about a forthcoming paper in Science by a team of seven NASA scientists. They had found that the world had already warmed in the past century. Temperatures hadn’t increased beyond the range of historical averages, but the scientists predicted that the warming signal would emerge from the noise of routine weather fluctuations much sooner than previously expected. Most unusual of all, the paper ended with a policy recommendation: In the coming decades, the authors wrote, humankind should develop alternative sources of energy and use fossil fuels only “as necessary.” The lead author was James Hansen.
  • Pomerance listened and watched. He understood Hansen’s basic findings well enough: Earth had been warming since 1880, and the warming would reach “almost unprecedented magnitude” in the next century, leading to the familiar suite of terrors, including the flooding of a 10th of New Jersey and a quarter of Louisiana and Florida. But Pomerance was excited to find that Hansen could translate the complexities of atmospheric science into plain English.
  • 7. ‘We’re All Going to Be the Victims’ March 1982
  • Gore had learned about climate change a dozen years earlier as an undergraduate at Harvard, when he took a class taught by Roger Revelle. Humankind was on the brink of radically transforming the global atmosphere, Revelle explained, drawing Keeling’s rising zigzag on the blackboard, and risked bringing about the collapse of civilization. Gore was stunned: Why wasn’t anyone talking about this?
  • Most in Congress considered the science committee a legislative backwater, if they considered it at all; this made Gore’s subcommittee, which had no legislative authority, an afterthought to an afterthought. That, Gore vowed, would change. Environmental and health stories had all the elements of narrative drama: villains, victims and heroes. In a hearing, you could summon all three, with the chairman serving as narrator, chorus and moral authority. He told his staff director that he wanted to hold a hearing every week.
  • The Revelle hearing went as Grumbly had predicted. The urgency of the issue was lost on Gore’s older colleagues, who drifted in and out while the witnesses testified. There were few people left by the time the Brookings Institution economist Lester Lave warned that humankind’s profligate exploitation of fossil fuels posed an existential test to human nature. “Carbon dioxide stands as a symbol now of our willingness to confront the future,” he said. “It will be a sad day when we decide that we just don’t have the time or thoughtfulness to address those issues.”
  • That night, the news programs featured the resolution of the baseball strike, the ongoing budgetary debate and the national surplus of butter.
  • There emerged, despite the general comity, a partisan divide. Unlike the Democrats, the Republicans demanded action. “Today I have a sense of déjà vu,” said Robert Walker, a Republican from Pennsylvania. In each of the last five years, he said, “we have been told and told and told that there is a problem with the increasing carbon dioxide in the atmosphere. We all accept that fact, and we realize that the potential consequences are certainly major in their impact on mankind.” Yet they had failed to propose a single law. “Now is the time,” he said. “The research is clear. It is up to us now to summon the political will.”
  • Hansen flew to Washington to testify on March 25, 1982, performing before a gallery even more thinly populated than at Gore’s first hearing on the greenhouse effect. Gore began by attacking the Reagan administration for cutting funding for carbon-dioxide research despite the “broad consensus in the scientific community that the greenhouse effect is a reality.” William Carney, a Republican from New York, bemoaned the burning of fossil fuels and argued passionately that science should serve as the basis for legislative policy
  • the experts invited by Gore agreed with the Republicans: The science was certain enough. Melvin Calvin, a Berkeley chemist who won the Nobel Prize for his work on the carbon cycle, said that it was useless to wait for stronger evidence of warming. “You cannot do a thing about it when the signals are so big that they come out of the noise,” he said. “You have to look for early warning signs.”
  • Hansen’s job was to share the warning signs, to translate the data into plain English. He explained a few discoveries that his team had made — not with computer models but in libraries. By analyzing records from hundreds of weather stations, he found that the surface temperature of the planet had already increased four-tenths of a degree Celsius in the previous century. Data from several hundred tide-gauge stations showed that the oceans had risen four inches since the 1880s
  • It occurred to Hansen that this was the only political question that mattered: How long until the worst began? It was not a question on which geophysicists expended much effort; the difference between five years and 50 years in the future was meaningless in geologic time. Politicians were capable of thinking only in terms of electoral time: six years, four years, two years. But when it came to the carbon problem, the two time schemes were converging.
  • “Within 10 or 20 years,” Hansen said, “we will see climate changes which are clearly larger than the natural variability.” James Scheuer wanted to make sure he understood this correctly. No one else had predicted that the signal would emerge that quickly. “If it were one or two degrees per century,” he said, “that would be within the range of human adaptability. But we are pushing beyond the range of human adaptability.” “Yes,” Hansen said.
  • How soon, Scheuer asked, would they have to change the national model of energy production? Hansen hesitated — it wasn’t a scientific question. But he couldn’t help himself. He had been irritated, during the hearing, by all the ludicrous talk about the possibility of growing more trees to offset emissions. False hopes were worse than no hope at all: They undermined the prospect of developing real solutions. “That time is very soon,” Hansen said finally. “My opinion is that it is past,” Calvin said, but he was not heard because he spoke from his seat. He was told to speak into the microphone. “It is already later,” Calvin said, “than you think.”
  • From Gore’s perspective, the hearing was an unequivocal success. That night Dan Rather devoted three minutes of “CBS Evening News” to the greenhouse effect. A correspondent explained that temperatures had increased over the previous century, great sheets of pack ice in Antarctica were rapidly melting, the seas were rising; Calvin said that “the trend is all in the direction of an impending catastrophe”; and Gore mocked Reagan for his shortsightedness. Later, Gore could take credit for protecting the Energy Department’s carbon-dioxide program, which in the end was largely preserved.
  • 8. ‘The Direction of an Impending Catastrophe’ 1982
  • Following Henry Shaw’s recommendation to establish credibility ahead of any future legislative battles, Exxon had begun to spend conspicuously on global-warming research. It donated tens of thousands of dollars to some of the most prominent research efforts, including one at Woods Hole led by the ecologist George Woodwell, who had been calling for major climate policy as early as the mid-1970s, and an international effort coordinated by the United Nations. Now Shaw offered to fund the October 1982 symposium on climate change at Columbia’s Lamont-Doherty campus.
  • David boasted that Exxon would usher in a new global energy system to save the planet from the ravages of climate change. He went so far as to argue that capitalism’s blind faith in the wisdom of the free market was “less than satisfying” when it came to the greenhouse effect. Ethical considerations were necessary, too. He pledged that Exxon would revise its corporate strategy to account for climate change, even if it were not “fashionable” to do so. As Exxon had already made heavy investments in nuclear and solar technology, he was “generally upbeat” that Exxon would “invent” a future of renewable energy.
  • Hansen had reason to feel upbeat himself. If the world’s largest oil-and-gas company supported a new national energy model, the White House would not stand in its way. The Reagan administration was hostile to change from within its ranks. But it couldn’t be hostile to Exxon.
  • The carbon-dioxide issue was beginning to receive major national attention — Hansen’s own findings had become front-page news, after all. What started as a scientific story was turning into a political story.
  • The political realm was itself a kind of Mirror World, a parallel reality that crudely mimicked our own. It shared many of our most fundamental laws, like the laws of gravity and inertia and publicity. And if you applied enough pressure, the Mirror World of politics could be sped forward to reveal a new future. Hansen was beginning to understand that too.
  • 1. ‘Caution, Not Panic’ 1983-1984
  • in the fall of 1983, the climate issue entered an especially long, dark winter. And all because of a single report that had done nothing to change the state of climate science but transformed the state of climate politics.
  • After the publication of the Charney report in 1979, Jimmy Carter had directed the National Academy of Sciences to prepare a comprehensive, $1 million analysis of the carbon-dioxide problem: a Warren Commission for the greenhouse effect. A team of scientist-dignitaries — among them Revelle, the Princeton modeler Syukuro Manabe and the Harvard political economist Thomas Schelling, one of the intellectual architects of Cold War game theory — would review the literature, evaluate the consequences of global warming for the world order and propose remedies
  • Then Reagan won the White House.
  • the incipient report served as the Reagan administration’s answer to every question on the subject. There could be no climate policy, Fred Koomanoff and his associates said, until the academy ruled. In the Mirror World of the Reagan administration, the warming problem hadn’t been abandoned at all. A careful, comprehensive solution was being devised. Everyone just had to wait for the academy’s elders to explain what it was.
  • The committee’s chairman, William Nierenberg — a Jason, presidential adviser and director of Scripps, the nation’s pre-eminent oceanographic institution — did not believe that action had to be taken immediately, before all the details could be known with certainty, or else it would be too late.
  • Better to bet on American ingenuity to save the day. Major interventions in national energy policy, taken immediately, might end up being more expensive, and less effective, than actions taken decades in the future, after more was understood about the economic and social consequences of a warmer planet. Yes, the climate would change, mostly for the worst, but future generations would be better equipped to change with it.
  • Government officials who knew Nierenberg were not surprised by his conclusions: He was an optimist by training and experience, a devout believer in the doctrine of American exceptionalism, one of the elite class of scientists who had helped the nation win a global war, invent the most deadly weapon conceivable and create the booming aerospace and computer industries. America had solved every existential problem it had confronted over the previous generation; it would not be daunted by an excess of carbon dioxide. Nierenberg had also served on Reagan’s transition team. Nobody believed that he had been directly influenced by his political connections, but his views — optimistic about the saving graces of market forces, pessimistic about the value of government regulation — reflected all the ardor of his party.
  • That’s what Nierenberg wrote in “Changing Climate.” But it’s not what he said in the press interviews that followed. He argued the opposite: There was no urgent need for action. The public should not entertain the most “extreme negative speculations” about climate change (despite the fact that many of those speculations appeared in his report). Though “Changing Climate” urged an accelerated transition to renewable fuels, noting that it would take thousands of years for the atmosphere to recover from the damage of the last century, Nierenberg recommended “caution, not panic.” Better to wait and see
  • The damage of “Changing Climate” was squared by the amount of attention it received. Nierenberg’s speech in the Great Hall, being one-500th the length of the actual assessment, received 500 times the press coverage. As The Wall Street Journal put it, in a line echoed by trade journals across the nation: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: You can cope.”
  • On “CBS Evening News,” Dan Rather said the academy had given “a cold shoulder” to a grim, 200-page E.P.A. assessment published earlier that week (titled “Can We Delay a Greenhouse Warming?”; the E.P.A.’s answer, reduced to a word, was no). The Washington Post described the two reports, taken together, as “clarion calls to inaction.”
  • Reagan’s science adviser, George Keyworth II, used Nierenberg’s optimism as a reason to discount the E.P.A.’s “unwarranted and unnecessarily alarmist” report and warned against taking any “near-term corrective action” on global warming. Just in case it wasn’t clear, Keyworth added, “there are no actions recommended other than continued research.”
  • Edward David Jr., two years removed from boasting of Exxon’s commitment to transforming global energy policy, told Science that the corporation had reconsidered. “Exxon has reverted to being mainly a supplier of conventional hydrocarbon fuels — petroleum products, natural gas and steam coal,” David said. The American Petroleum Institute canceled its own carbon-dioxide research program, too.
  • Exxon soon revised its position on climate-change research. In a presentation at an industry conference, Henry Shaw cited “Changing Climate” as evidence that “the general consensus is that society has sufficient time to technologically adapt to a CO₂ greenhouse effect.” If the academy had concluded that regulations were not a serious option, why should Exxon protest?
  • 2. ‘You Scientists Win’ 1985
  • 3. The Size of The Human Imagination Spring-Summer 1986
  • Curtis Moore’s proposal: Use ozone to revive climate. The ozone hole had a solution — an international treaty, already in negotiation. Why not hitch the milk wagon to the bullet train? Pomerance was skeptical. The problems were related, sure: Without a reduction in CFC emissions, you didn’t have a chance of averting cataclysmic global warming. But it had been difficult enough to explain the carbon issue to politicians and journalists; why complicate the sales pitch? Then again, he didn’t see what choice he had. The Republicans controlled the Senate, and Moore was his connection to the Senate’s environmental committee.
  • Pomerance met with Senator John Chafee, a Republican from Rhode Island, and helped persuade him to hold a double-barreled hearing on the twin problems of ozone and carbon dioxide on June 10 and 11, 1986
  • F. Sherwood Rowland, Robert Watson, a NASA scientist, and Richard Benedick, the administration’s lead representative in international ozone negotiations, would discuss ozone; James Hansen, Al Gore, the ecologist George Woodwell and Carl Wunsch, a veteran of the Charney group, would testify about climate change.
  • As Pomerance had hoped, fear about the ozone layer ensured a bounty of press coverage for the climate-change testimony. But as he had feared, it caused many people to conflate the two crises. One was Peter Jennings, who aired the video on ABC’s “World News Tonight,” warning that the ozone hole “could lead to flooding all over the world, also to drought and to famine.”
  • The confusion helped: For the first time since the “Changing Climate” report, global-warming headlines appeared by the dozen. William Nierenberg’s “caution, not panic” line was inverted. It was all panic without a hint of caution: “A Dire Forecast for ‘Greenhouse’ Earth” (the front page of The Washington Post); “Scientists Predict Catastrophes in Growing Global Heat Wave” (Chicago Tribune); “Swifter Warming of Globe Foreseen” (The New York Times).
  • After three years of backsliding and silence, Pomerance was exhilarated to see interest in the issue spike overnight. Not only that: A solution materialized, and a moral argument was passionately articulated — by Rhode Island’s Republican senator no less. “Ozone depletion and the greenhouse effect can no longer be treated solely as important scientific questions,” Chafee said. “They must be seen as critical problems facing the nations of the world, and they are problems that demand solutions.”
  • The old canard about the need for more research was roundly mocked — by Woodwell, by a W.R.I. colleague named Andrew Maguire, by Senator George Mitchell, a Democrat from Maine. “Scientists are never 100 percent certain,” the Princeton historian Theodore Rabb testified. “That notion of total certainty is something too elusive ever to be sought.” As Pomerance had been saying since 1979, it was past time to act. Only now the argument was so broadly accepted that nobody dared object.
  • The ozone hole, Pomerance realized, had moved the public because, though it was no more visible than global warming, people could be made to see it. They could watch it grow on video. Its metaphors were emotionally wrought: Instead of summoning a glass building that sheltered plants from chilly weather (“Everything seems to flourish in there”), the hole evoked a violent rending of the firmament, inviting deathly radiation. Americans felt that their lives were in danger. An abstract, atmospheric problem had been reduced to the size of the human imagination. It had been made just small enough, and just large enough, to break through.
  • Four years after “Changing Climate,” two years after a hole had torn open the firmament and a month after the United States and more than three dozen other nations signed a treaty to limit use of CFCs, the climate-change corps was ready to celebrate. It had become conventional wisdom that climate change would follow ozone’s trajectory. Reagan’s E.P.A. administrator, Lee M. Thomas, said as much the day he signed the Montreal Protocol on Substances That Deplete the Ozone Layer (the successor to the Vienna Convention), telling reporters that global warming was likely to be the subject of a future international agreement
  • Congress had already begun to consider policy — in 1987 alone, there were eight days of climate hearings, in three committees, across both chambers of Congress; Senator Joe Biden, a Delaware Democrat, had introduced legislation to establish a national climate-change strategy. And so it was that Jim Hansen found himself on Oct. 27 in the not especially distinguished ballroom of the Quality Inn on New Jersey Avenue, a block from the Capitol, at “Preparing for Climate Change,” which was technically a conference but felt more like a wedding.
  • John Topping was an old-line Rockefeller Republican, a Commerce Department lawyer under Nixon and an E.P.A. official under Reagan. He first heard about the climate problem in the halls of the E.P.A. in 1982 and sought out Hansen, who gave him a personal tutorial. Topping was amazed to discover that out of the E.P.A.’s 13,000-person staff, only seven people, by his count, were assigned to work on climate, though he figured it was more important to the long-term security of the nation than every other environmental issue combined.
  • Glancing around the room, Jim Hansen could chart, like an arborist counting rings on a stump, the growth of the climate issue over the decade. Veterans like Gordon MacDonald, George Woodwell and the environmental biologist Stephen Schneider stood at the center of things. Former and current staff members from the congressional science committees (Tom Grumbly, Curtis Moore, Anthony Scoville) made introductions to the congressmen they advised. Hansen’s owlish nemesis Fred Koomanoff was present, as were his counterparts from the Soviet Union and Western Europe. Rafe Pomerance’s cranium could be seen above the crowd, but unusually he was surrounded by colleagues from other environmental organizations that until now had shown little interest in a diffuse problem with no proven fund-raising record. The party’s most conspicuous newcomers, however, the outermost ring, were the oil-and-gas executives.
  • That evening, as a storm spat and coughed outside, Rafe Pomerance gave one of his exhortative speeches urging cooperation among the various factions, and John Chafee and Roger Revelle received awards; introductions were made and business cards earnestly exchanged. Not even a presentation by Hansen of his research could sour the mood. The next night, on Oct. 28, at a high-spirited dinner party in Topping’s townhouse on Capitol Hill, the oil-and-gas men joked with the environmentalists, the trade-group representatives chatted up the regulators and the academics got merrily drunk. Mikhail Budyko, the don of the Soviet climatologists, settled into an extended conversation about global warming with Topping’s 10-year-old son. It all seemed like the start of a grand bargain, a uniting of factions — a solution.
  • Hansen was accustomed to the bureaucratic nuisances that attended testifying before Congress; before a hearing, he had to send his formal statement to NASA headquarters, which forwarded it to the White House’s Office of Management and Budget for approval. “Major greenhouse climate changes are a certainty,” he had written. “By the 2010s [in every scenario], essentially the entire globe has very substantial warming.”
  • By all appearances, plans for major policy continued to advance rapidly. After the Johnston hearing, Timothy Wirth, a freshman Democratic senator from Colorado on the energy committee, began to plan a comprehensive package of climate-change legislation — a New Deal for global warming. Wirth asked a legislative assistant, David Harwood, to consult with experts on the issue, beginning with Rafe Pomerance, in the hope of converting the science of climate change into a new national energy policy.
  • In March 1988, Wirth joined 41 other senators, nearly half of them Republicans, to demand that Reagan call for an international treaty modeled after the ozone agreement. Because the United States and the Soviet Union were the world’s two largest contributors of carbon emissions, responsible for about one-third of the world total, they should lead the negotiations. Reagan agreed. In May, he signed a joint statement with Mikhail Gorbachev that included a pledge to cooperate on global warming.
  • Al Gore himself had, for the moment, withdrawn his political claim to the issue. In 1987, at the age of 39, Gore announced that he was running for president, in part to bring attention to global warming, but he stopped emphasizing it after the subject failed to captivate New Hampshire primary voters.
  • 5. ‘You Will See Things That You Shall Believe’ Summer 1988
  • It was the hottest and driest summer in history. Everywhere you looked, something was bursting into flames. Two million acres in Alaska incinerated, and dozens of major fires scored the West. Yellowstone National Park lost nearly one million acres. Smoke was visible from Chicago, 1,600 miles away.
  • In Nebraska, suffering its worst drought since the Dust Bowl, there were days when every weather station registered temperatures above 100 degrees. The director of the Kansas Department of Health and Environment warned that the drought might be the dawning of a climatic change that within a half century could turn the state into a desert.
  • On June 22 in Washington, where it hit 100 degrees, Rafe Pomerance received a call from Jim Hansen, who was scheduled to testify the following morning at a Senate hearing called by Timothy Wirth. “I hope we have good media coverage tomorrow,” Hansen said.
  • Hansen had just received the most recent global temperature data. Just over halfway into the year, 1988 was setting records. Already it had nearly clinched the hottest year in history. Ahead of schedule, the signal was emerging from the noise. “I’m going to make a pretty strong statement,” Hansen said.
  • Hansen returned to his testimony. He wrote: “The global warming is now large enough that we can ascribe with a high degree of confidence a cause-and-effect relationship to the greenhouse effect.” He wrote: “1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.” He wrote: “The greenhouse effect has been detected, and it is changing our climate now.”
  • “We have only one planet,” Senator Bennett Johnston intoned. “If we screw it up, we have no place to go.” Senator Max Baucus, a Democrat from Montana, called for the United Nations Environment Program to begin preparing a global remedy to the carbon-dioxide problem. Senator Dale Bumpers, a Democrat of Arkansas, previewed Hansen’s testimony, saying that it “ought to be cause for headlines in every newspaper in America tomorrow morning.” The coverage, Bumpers emphasized, was a necessary precursor to policy. “Nobody wants to take on any of the industries that produce the things that we throw up into the atmosphere,” he said. “But what you have are all these competing interests pitted against our very survival.”
  • Hansen, wiping his brow, spoke without affect, his eyes rarely rising from his notes. The warming trend could be detected “with 99 percent confidence,” he said. “It is changing our climate now.” But he saved his strongest comment for after the hearing, when he was encircled in the hallway by reporters. “It is time to stop waffling so much,” he said, “and say that the evidence is pretty strong that the greenhouse effect is here.”
  • The press followed Bumpers’s advice. Hansen’s testimony prompted headlines in dozens of newspapers across the country, including The New York Times, which announced, across the top of its front page: “Global Warming Has Begun, Expert Tells Senate.”
  • Rafe Pomerance called his allies on Capitol Hill, the young staff members who advised politicians, organized hearings, wrote legislation. We need to finalize a number, he told them, a specific target, in order to move the issue — to turn all this publicity into policy. The Montreal Protocol had called for a 50 percent reduction in CFC emissions by 1998. What was the right target for carbon emissions? It wasn’t enough to exhort nations to do better. That kind of talk might sound noble, but it didn’t change investments or laws. They needed a hard goal — something ambitious but reasonable. And they needed it soon: Just four days after Hansen’s star turn, politicians from 46 nations and more than 300 scientists would convene in Toronto at the World Conference on the Changing Atmosphere, an event described by Philip Shabecoff of The New York Times as “Woodstock for climate change.”
  • Pomerance had a proposal: a 20 percent reduction in carbon emissions by 2000. Ambitious, Harwood said. In all his work planning climate policy, he had seen no assurance that such a steep drop in emissions was possible. Then again, 2000 was more than a decade off, so it allowed for some flexibility.
  • Mintzer pointed out that a 20 percent reduction was consistent with the academic literature on energy efficiency. Various studies over the years had shown that you could improve efficiency in most energy systems by roughly 20 percent if you adopted best practices.
  • Of course, with any target, you had to take into account the fact that the developing world would inevitably consume much larger quantities of fossil fuels by 2000. But those gains could be offset by a wider propagation of the renewable technologies already at hand — solar, wind, geothermal. It was not a rigorous scientific analysis, Mintzer granted, but 20 percent sounded plausible. We wouldn’t need to solve cold fusion or ask Congress to repeal the law of gravity. We could manage it with the knowledge and technology we already had.
  • Besides, Pomerance said, 20 by 2000 sounds good.
  • The conference’s final statement, signed by all 400 scientists and politicians in attendance, repeated the demand with a slight variation: a 20 percent reduction in carbon emissions by 2005. Just like that, Pomerance’s best guess became global diplomatic policy.
  • Hansen, emerging from Anniek’s successful cancer surgery, took it upon himself to start a one-man public information campaign. He gave news conferences and was quoted in seemingly every article about the issue; he even appeared on television with homemade props. Like an entrant at an elementary-school science fair, he made “loaded dice” out of sections of cardboard and colored paper to illustrate the increased likelihood of hotter weather in a warmer climate. Public awareness of the greenhouse effect reached a new high of 68 percent
  • global warming became a major subject of the presidential campaign. While Michael Dukakis proposed tax incentives to encourage domestic oil production and boasted that coal could satisfy the nation’s energy needs for the next three centuries, George Bush took advantage. “I am an environmentalist,” he declared on the shore of Lake Erie, the first stop on a five-state environmental tour that would take him to Boston Harbor, Dukakis’s home turf. “Those who think we are powerless to do anything about the greenhouse effect,” he said, “are forgetting about the White House effect.”
  • His running mate emphasized the ticket’s commitment to the issue at the vice-presidential debate. “The greenhouse effect is an important environmental issue,” Dan Quayle said. “We need to get on with it. And in a George Bush administration, you can bet that we will.”
  • This kind of talk roused the oil-and-gas men. “A lot of people on the Hill see the greenhouse effect as the issue of the 1990s,” a gas lobbyist told Oil & Gas Journal. Before a meeting of oil executives shortly after the “environmentalist” candidate won the election, Representative Dick Cheney, a Wyoming Republican, warned, “It’s going to be very difficult to fend off some kind of gasoline tax.” The coal industry, which had the most to lose from restrictions on carbon emissions, had moved beyond denial to resignation. A spokesman for the National Coal Association acknowledged that the greenhouse effect was no longer “an emerging issue. It is here already, and we’ll be hearing more and more about it.”
  • By the end of the year, 32 climate bills had been introduced in Congress, led by Wirth’s omnibus National Energy Policy Act of 1988. Co-sponsored by 13 Democrats and five Republicans, it established as a national goal an “International Global Agreement on the Atmosphere by 1992,” ordered the Energy Department to submit to Congress a plan to reduce energy use by at least 2 percent a year through 2005 and directed the Congressional Budget Office to calculate the feasibility of a carbon tax. A lawyer for the Senate energy committee told an industry journal that lawmakers were “frightened” by the issue and predicted that Congress would eventually pass significant legislation after Bush took office
  • The other great powers refused to wait. The German Parliament created a special commission on climate change, which concluded that action had to be taken immediately, “irrespective of any need for further research,” and that the Toronto goal was inadequate; it recommended a 30 percent reduction of carbon emissions
  • Margaret Thatcher, who had studied chemistry at Oxford, warned in a speech to the Royal Society that global warming could “greatly exceed the capacity of our natural habitat to cope” and that “the health of the economy and the health of our environment are totally dependent upon each other.”
  • The prime ministers of Canada and Norway called for a binding international treaty on the atmosphere; Sweden’s Parliament went further, announcing a national strategy to stabilize emissions at the 1988 level and eventually imposing a carbon tax
  • the United Nations unanimously endorsed the establishment, by the World Meteorological Organization and the United Nations Environment Program, of an Intergovernmental Panel on Climate Change, composed of scientists and policymakers, to conduct scientific assessments and develop global climate policy.
  • One of the I.P.C.C.’s first sessions to plan an international treaty was hosted by the State Department, 10 days after Bush’s inauguration. James Baker chose the occasion to make his first speech as secretary of state. “We can probably not afford to wait until all of the uncertainties about global climate change have been resolved,” he said. “Time will not make the problem go away.”
  • On April 14, 1989, a bipartisan group of 24 senators, led by the majority leader, George Mitchell, requested that Bush cut emissions in the United States even before the I.P.C.C.’s working group made its recommendation. “We cannot afford the long lead times associated with a comprehensive global agreement,” the senators wrote. Bush had promised to combat the greenhouse effect with the White House effect. The self-proclaimed environmentalist was now seated in the Oval Office. It was time.
  • 8. ‘You Never Beat The White House’ April 1989
  • After Jim Baker gave his boisterous address to the I.P.C.C. working group at the State Department, he received a visit from John Sununu, Bush’s chief of staff. Leave the science to the scientists, Sununu told Baker. Stay clear of this greenhouse-effect nonsense. You don’t know what you’re talking about. Baker, who had served as Reagan’s chief of staff, didn’t speak about the subject again.
  • Despite his reputation as a political wolf, Sununu still thought of himself as a scientist — an “old engineer,” as he was fond of putting it, having earned a Ph.D. in mechanical engineering from M.I.T. decades earlier. He lacked the reflexive deference that so many of his political generation reserved for the class of elite government scientists.
  • Since World War II, he believed, conspiratorial forces had used the imprimatur of scientific knowledge to advance an “anti-growth” doctrine. He reserved particular disdain for Paul Ehrlich’s “The Population Bomb,” which prophesied that hundreds of millions of people would starve to death if the world took no step to curb population growth; the Club of Rome, an organization of European scientists, heads of state and economists, which similarly warned that the world would run out of natural resources; and as recently as the mid-’70s, the hypothesis advanced by some of the nation’s most celebrated scientists — including Carl Sagan, Stephen Schneider and Ichtiaque Rasool — that a new ice age was dawning, thanks to the proliferation of man-made aerosols. All were theories of questionable scientific merit, portending vast, authoritarian remedies to halt economic progress.
  • Sununu had suspected that the greenhouse effect belonged to this nefarious cabal since 1975, when the anthropologist Margaret Mead convened a symposium on the subject at the National Institute of Environmental Health Sciences.
  • When Mead talked about “far-reaching” decisions and “long-term consequences,” Sununu heard the marching of jackboots.
  • While Sununu and Darman reviewed Hansen’s statements, the E.P.A. administrator, William K. Reilly, took a new proposal to the White House. The next meeting of the I.P.C.C.’s working group was scheduled for Geneva the following month, in May; it was the perfect occasion, Reilly argued, to take a stronger stand on climate change. Bush should demand a global treaty to reduce carbon emissions.
  • Sununu wouldn’t budge. He ordered the American delegates not to make any commitment in Geneva. Very soon after that, someone leaked the exchange to the press.
  • A deputy of Jim Baker pulled Reilly aside. He said he had a message from Baker, who had observed Reilly’s infighting with Sununu. “In the long run,” the deputy warned Reilly, “you never beat the White House.”
  • 9. ‘A Form of Science Fraud’ May 1989
  • The cameras followed Hansen and Gore into the marbled hallway. Hansen insisted that he wanted to focus on the science. Gore focused on the politics. “I think they’re scared of the truth,” he said. “They’re scared that Hansen and the other scientists are right and that some dramatic policy changes are going to be needed, and they don’t want to face up to it.”
  • The censorship did more to publicize Hansen’s testimony and the dangers of global warming than anything he could have possibly said. At the White House briefing later that morning, Press Secretary Marlin Fitzwater admitted that Hansen’s statement had been changed. He blamed an official “five levels down from the top” and promised that there would be no retaliation. Hansen, he added, was “an outstanding and distinguished scientist” and was “doing a great job.”
  • 10. The White House Effect Fall 1989
  • The Los Angeles Times called the censorship “an outrageous assault.” The Chicago Tribune said it was the beginning of “a cold war on global warming,” and The New York Times warned that the White House’s “heavy-handed intervention sends the signal that Washington wants to go slow on addressing the greenhouse problem.”
  • Darman went to see Sununu. He didn’t like being accused of censoring scientists. They needed to issue some kind of response. Sununu called Reilly to ask if he had any ideas. We could start, Reilly said, by recommitting to a global climate treaty. The United States was the only Western nation on record as opposing negotiations.
  • Sununu sent a telegram to Geneva endorsing a plan “to develop full international consensus on necessary steps to prepare for a formal treaty-negotiating process. The scope and importance of this issue are so great that it is essential for the U.S. to exercise leadership.”
  • Sununu seethed at any mention of the subject. He had taken it upon himself to study more deeply the greenhouse effect; he would have a rudimentary, one-dimensional general circulation model installed on his personal desktop computer. He decided that the models promoted by Jim Hansen were a lot of bunk. They were horribly imprecise in scale and underestimated the ocean’s ability to mitigate warming. Sununu complained about Hansen to D. Allan Bromley, a nuclear physicist from Yale who, at Sununu’s recommendation, was named Bush’s science adviser. Hansen’s findings were “technical poppycock” that didn’t begin to justify such wild-eyed pronouncements that “the greenhouse effect is here” or that the 1988 heat waves could be attributed to global warming, let alone serve as the basis for national economic policy.
  • When a junior staff member in the Energy Department, in a meeting at the White House with Sununu and Reilly, mentioned an initiative to reduce fossil-fuel use, Sununu interrupted her. “Why in the world would you need to reduce fossil-fuel use?” he asked. “Because of climate change,” the young woman replied. “I don’t want anyone in this administration without a scientific background using ‘climate change’ or ‘global warming’ ever again,” he said. “If you don’t have a technical basis for policy, don’t run around making decisions on the basis of newspaper headlines.” After the meeting, Reilly caught up to the staff member in the hallway. She was shaken. Don’t take it personally, Reilly told her. Sununu might have been looking at you, but that was directed at me.
  • Reilly, for his part, didn’t entirely blame Sununu for Bush’s indecision on the prospect of a climate treaty. The president had never taken a vigorous interest in global warming and was mainly briefed about it by nonscientists. Bush had brought up the subject on the campaign trail, in his speech about the White House effect, after leafing through a briefing booklet for a new issue that might generate some positive press. When Reilly tried in person to persuade him to take action, Bush deferred to Sununu and Baker. Why don’t the three of you work it out, he said. Let me know when you decide
  • Relations between Sununu and Reilly became openly adversarial. Reilly, Sununu thought, was a creature of the environmental lobby. He was trying to impress his friends at the E.P.A. without having a basic grasp of the science himself.
  • Pomerance had the sinking feeling that the momentum of the previous year was beginning to flag. The censoring of Hansen’s testimony and the inexplicably strident opposition from John Sununu were ominous signs. So were the findings of a report Pomerance had commissioned, published in September by the World Resources Institute, tracking global greenhouse-gas emissions. The United States was the largest contributor by far, producing nearly a quarter of the world’s carbon emissions, and its contribution was growing faster than that of every other country. Bush’s indecision, or perhaps inattention, had already managed to delay the negotiation of a global climate treaty until 1990 at the earliest, perhaps even 1991. By then, Pomerance worried, it would be too late.
  • Pomerance tried to be more diplomatic. “The president made a commitment to the American people to deal with global warming,” he told The Washington Post, “and he hasn’t followed it up.” He didn’t want to sound defeated. “There are some good building blocks here,” Pomerance said, and he meant it. The Montreal Protocol on CFCs wasn’t perfect at first, either — it had huge loopholes and weak restrictions. Once in place, however, the restrictions could be tightened. Perhaps the same could happen with climate change. Perhaps. Pomerance was not one for pessimism. As William Reilly told reporters, dutifully defending the official position forced upon him, it was the first time that the United States had formally endorsed the concept of an emissions limit. Pomerance wanted to believe that this was progress.
  • All week in Noordwijk, Becker couldn’t stop talking about what he had seen in Zeeland. After a flood in 1953, when the sea swallowed much of the region, killing more than 2,000 people, the Dutch began to build the Delta Works, a vast concrete-and-steel fortress of movable barriers, dams and sluice gates — a masterpiece of human engineering. The whole system could be locked into place within 90 minutes, defending the land against storm surge. It reduced the country’s exposure to the sea by 700 kilometers, Becker explained. The United States coastline was about 153,000 kilometers long. How long, he asked, was the entire terrestrial coastline? Because the whole world was going to need this. In Zeeland, he said, he had seen the future.
  • Ken Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, Calif., has a habit of asking new graduate students to name the largest fundamental breakthrough in climate physics since 1979. It’s a trick question. There has been no breakthrough. As with any mature scientific discipline, there is only refinement. The computer models grow more precise; the regional analyses sharpen; estimates solidify into observational data. Where there have been inaccuracies, they have tended to be in the direction of understatement.
  • More carbon has been released into the atmosphere since the final day of the Noordwijk conference, Nov. 7, 1989, than in the entire history of civilization preceding it
  • Despite every action taken since the Charney report — the billions of dollars invested in research, the nonbinding treaties, the investments in renewable energy — the only number that counts, the total quantity of global greenhouse gas emitted per year, has continued its inexorable rise.
  • When it comes to our own nation, which has failed to make any binding commitments whatsoever, the dominant narrative for the last quarter century has concerned the efforts of the fossil-fuel industries to suppress science, confuse public knowledge and bribe politicians.
  • The mustache-twirling depravity of these campaigns has left the impression that the oil-and-gas industry always operated thus; while the Exxon scientists and American Petroleum Institute clerics of the ’70s and ’80s were hardly good Samaritans, they did not start multimillion-dollar disinformation campaigns, pay scientists to distort the truth or try to brainwash children in elementary schools, as their successors would.
  • It was James Hansen’s testimony before Congress in 1988 that, for the first time since the “Changing Climate” report, made oil-and-gas executives begin to consider the issue’s potential to hurt their profits. Exxon, as ever, led the field. Six weeks after Hansen’s testimony, Exxon’s manager of science and strategy development, Duane LeVine, prepared an internal strategy paper urging the company to “emphasize the uncertainty in scientific conclusions.” This shortly became the default position of the entire sector. LeVine, it so happened, served as chairman of the global petroleum industry’s Working Group on Global Climate Change, created the same year, which adopted Exxon’s position as its own
  • The American Petroleum Institute, after holding a series of internal briefings on the subject in the fall and winter of 1988, including one for the chief executives of the dozen or so largest oil companies, took a similar, if slightly more diplomatic, line. It set aside money for carbon-dioxide policy — about $100,000, a fraction of the millions it was spending on the health effects of benzene, but enough to establish a lobbying organization called, in an admirable flourish of newspeak, the Global Climate Coalition.
  • The G.C.C. was conceived as a reactive body, to share news of any proposed regulations, but on a whim, it added a press campaign, to be coordinated mainly by the A.P.I. It gave briefings to politicians known to be friendly to the industry and approached scientists who professed skepticism about global warming. The A.P.I.’s payment for an original op-ed was $2,000.
  • It was joined by the U.S. Chamber of Commerce and 14 other trade associations, including those representing the coal, electric-grid and automobile industries
  • In October 1989, scientists allied with the G.C.C. began to be quoted in national publications, giving an issue that lacked controversy a convenient fulcrum. “Many respected scientists say the available evidence doesn’t warrant the doomsday warnings,” was the caveat that began to appear in articles on climate change.
  • The following year, when President Bill Clinton proposed an energy tax in the hope of meeting the goals of the Rio treaty, the A.P.I. invested $1.8 million in a G.C.C. disinformation campaign. Senate Democrats from oil-and-coal states joined Republicans to defeat the tax proposal, which later contributed to the Republicans’ rout of Democrats in the midterm congressional elections in 1994 — the first time the Republican Party had won control of both houses in 40 years
  • The G.C.C. spent $13 million on a single ad campaign intended to weaken support for the 1997 Kyoto Protocol, which committed its parties to reducing greenhouse-gas emissions by 5 percent relative to 1990 levels. The Senate, which would have had to ratify the agreement, took a pre-emptive vote declaring its opposition; the resolution passed 95-0. There has never been another serious effort to negotiate a binding global climate treaty.
  • This has made Exxon an especially vulnerable target for the wave of compensatory litigation that began in earnest in the last three years and may last a generation. Tort lawsuits have become possible only in recent years, as scientists have begun more precisely to attribute regional effects to global emission levels. This is one subfield of climate science that has advanced significantly.
  • Pomerance had not been among the 400 delegates invited to Noordwijk. But together with three young activists — Daniel Becker of the Sierra Club, Alden Meyer of the Union of Concerned Scientists and Stewart Boyle from Friends of the Earth — he had formed his own impromptu delegation. Their constituency, they liked to say, was the climate itself. Their mission was to pressure the delegates to include in the final conference statement, which would be used as the basis for a global treaty, the target proposed in Toronto: a 20 percent reduction of greenhouse-gas combustion by 2005. It was the only measure that mattered, the amount of emissions reductions, and the Toronto number was the strongest global target yet proposed.
  • The delegations would review the progress made by the I.P.C.C. and decide whether to endorse a framework for a global treaty. There was a general sense among the delegates that they would, at minimum, agree to the target proposed by the host, the Dutch environmental minister, more modest than the Toronto number: a freezing of greenhouse-gas emissions at 1990 levels by 2000. Some believed that if the meeting was a success, it would encourage the I.P.C.C. to accelerate its negotiations and reach a decision about a treaty sooner. But at the very least, the world’s environmental ministers should sign a statement endorsing a hard, binding target of emissions reductions. The mood among the delegates was electric, nearly giddy — after more than a decade of fruitless international meetings, they could finally sign an agreement that meant something.
  • 11. ‘The Skunks at The Garden Party’ November 1989
  • It was nearly freezing — Nov. 6, 1989, on the coast of the North Sea in the Dutch resort town of Noordwijk
  • Losing Earth: The Decade We Almost Stopped Climate Change. We knew everything we needed to know, and nothing stood in our way. Nothing, that is, except ourselves. A tragedy in two acts. By Nathaniel Rich. Photographs and Videos by George Steinmetz. Aug. 1, 2018
Javier E

Facebook, Google, and the Death of the Public Square - The Atlantic - 0 views

  • Beyond the Areopagitica’s condemnation of censorship, Milton was really defending the underlying spiritual and intellectual chaos, and the institutions that nourished it. In his lifetime, the printing press had changed everything.
  • He accorded books religious significance, which was really the highest compliment he could offer, since he took his religion so seriously: “Who kills a man kills a reasonable creature, God’s image; but he who destroys a good book, kills reason itself, kills the image of God, as it were, in the eye ...
  • At the core, Milton was defending something intensely private—the conscience, the freedom of each citizen to arrive at their own religious conviction. “Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties.”
  • ...25 more annotations...
  • But Milton also stirringly articulated  how the formation of private convictions required public spaces, public institutions—what Jürgen Habermas so famously defined as the “public sphere.”
  • By the time he wrote Areopagitica, it was robust: coffee houses, newspapers, book publishers and bookstores, theatres, and meeting places—the locales that allowed individuals to come together to form a public. These were spaces largely outside the grasp of church and state—and, in fact, many of these institutions emerged with the express purpose of liberating society from the grasp of church and state.
  • Nobody designed the public sphere from a dorm room or a Silicon Valley garage. It just started to organically accrete, as printed volumes began to pile, as liberal ideas gained currency and made space for even more liberal ideas. Institutions grew, and then over the centuries acquired prestige and authority. Newspapers and journals evolved into what we call media. Book publishing emerged from the printing guilds, and eventually became taste-making, discourse-shaping enterprises. What was born in Milton’s lifetime lasted until our own.
  • It took centuries for the public sphere to develop—and the technology companies have eviscerated it in a flash. By radically remaking the advertising business and commandeering news distribution, Google and Facebook have damaged the economics of journalism. Amazon has thrashed the bookselling business in the U.S. They have shredded old ideas about intellectual property—which had provided the economic and philosophical basis for authorship
  • Big tech has made a fetish of efficiency, of data, of the wisdom of the market. These are the underlying principles that explain why Google returns such terrible responses to the God query. Google is merely giving us what’s popular, what’s most clicked upon, not what’s worthy
  • This assault on the public sphere is an assault on free expression. In the West, free expression is a transcendent right only in theory—in practice its survival is contingent and tenuous.
  • We’re witnessing the way in which public conversation is subverted by name-calling and harassment. We can convince ourselves that these are fringe characteristics of social media, but social media has implanted such tendencies at the core of the culture. They are in fact practiced by mainstream journalists, mobs of the well meaning, and the president of the United States. The toxicity of the environment shreds the quality of conversation and deters meaningful participation in it
  • it becomes harder and harder to cling to the idea of the rational individual, formulating opinions on the basis of conscience. And as we lose faith in that principle, the public will lose faith in the necessity of preserving the protections of free speech.
  • The public sphere was always rife with manipulation—political persuasion, after all, involves a healthy dose of emotionalism and the tapping of submerged biases
  • humankind is entering into an era where manipulation has grown simultaneously invisible, terrifyingly precise, and embedded in everyday life.
  • And now, the tech giants are racing to insert themselves more intimately in people’s lives, this time as  personal assistants. The tech companies want us to tie ourselves closely to their machines
  • These machines don’t present us with choices. They aren’t designed to present us with a healthy menu of options. They anticipate our wants and needs, even our informational and cultural wants and needs.
  • What’s so pernicious about these machines is that they weaponize us against ourselves. They take our data—everywhere we have traveled on the web, every query we’ve entered into Google, even the posts we begin to write but never publish—and exploit this knowledge to reduce us to marionettes
  • To state the obvious, these are multinational corporations, with an ultimate interest in their bottom lines. They will never be capable of regulating the public sphere that they control in any name other than their own profit.
  • the Facebook CEO supplied a response that befuddled the senator: “I think the real question, as the internet becomes more important in people's lives, is what is the right regulation, not whether there should be or not.”
  • now that these companies bestride the markets in which they roam, the primary danger they face isn’t meddling regulators or hyperactive legislators. What the behemoths of Silicon Valley truly fear is the possibility that antitrust laws will be deployed.
  • tech companies carry a very different sort of cargo—they trade in the commodities of speech. Once we extend the state into this realm, we’re entering dangerous territory.
  • We don’t need to use our imaginations here. There are examples all over the world—in Russia, in China—where governments have made their peace with social media, by setting the terms that govern it. These regimes permit a cacophony of ideas, except for the ones that truly challenge political power.
  • The present global explosion of anxiety and hate is unlike anything most of us have ever witnessed. People don’t know how to confront these evils
  • In the face of such menace, it’s natural to appeal to a higher power for protection—but in our panic we need to be clear about which threats are genuine and which are merely rhetorical. And panic shouldn’t lead us to seek protection that inadvertently squashes our own liberties.
  • Silicon Valley doesn’t understand truth as a quest or trial, but as an engineering challenge. They believe human behavior and human choices can be predicted by algorithms on the basis of past behavior
  • They believe that our lives can be programmed to be more efficient. By steering and nudging us, by designing the architecture of our decision-making process, they claim to be relieving us of the burden of choice. Silicon Valley talks endlessly about the virtues of the frictionless life.
  • As we join Zuckerberg’s community, he fantasizes that the sense of connection will cause our differences to melt away—like a digital version of the old Coca Cola commercial, or, as I argue in my book, World Without Mind, a revival of the sixties counterculture and the vision of life on a commune.
  • In other words, preservation of democracy requires preserving this ecosystem of ideas that has miraculously persisted with us since the 17th century. People can’t afford to be seduced by the false prophets of disruption, the charlatans who argue that we abandon old wisdoms in the face of new gadgets
  • We need to shape the culture so that the prestige of engineering doesn’t continue to come at the expense of the humanities. We need to preserve literature as a primary technology for interrogating the meaning of life. We need to resist the tendency to reduce the world to data.
Javier E

America has no real public health system - coronavirus has a clear run | Robert Reich |... - 0 views

  • Fauci, director of the National Institute of Allergy and Infectious Diseases and just about the only official in the Trump administration trusted to tell the truth about the coronavirus, said last Thursday: “The system does not, is not really geared to what we need right now … It is a failing, let’s admit it.”
  • The system would be failing even under a halfway competent president. The dirty little secret, which will soon become apparent to all, is that there is no real public health system in the United States.
  • America is waking up to the fact that it has almost no public capacity to deal with it.
  • Instead of a public health system, we have a private for-profit system for individuals lucky enough to afford it and a rickety social insurance system for people fortunate enough to have a full-time job.
  • At their best, both systems respond to the needs of individuals rather than the needs of the public as a whole
  • In America, the word “public” – as in public health, public education or public welfare – means a sum total of individual needs, not the common good.
  • Almost 30% of American workers have no paid sick leave from their employers, including 70% of low-income workers earning less than $10.49 an hour. Vast numbers of self-employed workers cannot afford sick leave. Friday’s deal between House Democrats and the White House won’t have much effect because it exempts large employers and offers waivers to smaller ones.
  • there are no institutions analogous to the Fed with responsibility for overseeing and managing the public’s health – able to whip out a giant checkbook at a moment’s notice to prevent human, rather than financial, devastation
  • Even if a test for the Covid-19 virus had been developed and approved in time, no institutions are in place to administer it to tens of millions of Americans free of charge.
  • Healthcare in America is delivered mainly by private for-profit corporations which, unlike financial institutions, are not required to maintain reserve capacity.
  • Its 45,000 intensive care unit beds fall woefully short of the 2.9 million likely to be needed.
  • Contrast this with America’s financial system. The Federal Reserve concerns itself with the health of financial markets as a whole. Late last week the Fed made $1.5tn available to banks, at the slightest hint of difficulties making trades. No one batted an eye.
  • more than 30 million Americans have no health insurance. Eligibility for Medicaid, food stamps and other public assistance is now linked to having or actively looking for work.
  • In Los Angeles, about 80% of students qualify for free or reduced lunches and just under 20,000 are homeless at some point during the school year.
  • There is no public health system in the US, in short, because the richest nation in the world has no capacity to protect the public as a whole, apart from national defense
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage: the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies”?
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed "there is a free exchange of views in this company" and "people are not afraid to say what they really think".
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times's failure to take Trump's chances seriously as an example of how "prematurely shutting down inquiry and debate" can allow "conventional wisdom to ossify in a way that blinds society".
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values also better serves not just truth but justice, too
  • over the long term journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance, will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • (I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and opinion columnist. The paper's spokeswoman, Danielle Rhoades Ha, did not respond to the question.)
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom's journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, this reporter shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to "refine" it and "add context", and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.”
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

The Cutthroat World of Elite Public Schools - The Atlantic - 0 views

  • The issue at hand was—and still is—the city’s nine elite public high schools. Like most public high schools in the city, these schools can choose who attends. But the elite schools are their own animal: Whereas other schools look at a range of criteria to determine students’ eligibility, eight of these nine elite institutions admit applicants based exclusively on how the students score on a rigorous, two-and-a-half-hour-long standardized test.
  • The test-only admissions policy is touted by supporters as a tactic that promotes fairness and offers the best way to identify the city’s most gifted students. But the complaint, which is still pending, tells a different story—one of modern-day segregation, in which poor kids of color are getting left behind.
  • Public schools in cities across the country—schools intended to break down the walls typical of expensive, elite private institutions by opening up access to stimulating, quality education for kids of all means—are closed in their admissions. In other words, kids aren’t just automatically enrolled because they live in the neighborhood—they have to apply to get in
  • As a result, their student populations are often far less diverse than they should be. And, sometimes, kids who would otherwise be eligible for these schools never get to enjoy them.
  • The country, he discovered, is home to some 165 of these institutions—"exam schools," as he calls them—or 1 percent of all public high schools.
  • Secondly, selective-enrollment schools "are very sought after by upper-middle class people who might not consider using public schools if it weren’t for the selective-enrollment institutions. Essentially, it’s a way of ensuring greater participation from wealthier families who might otherwise move to the suburbs."
  • Selective-admissions programs are in part symptomatic of a broader, three-decade-old reform movement that has aimed to overcome the "mediocre educational performance" of the country’s students
  • They’re also an example of "school choice," the tenet that parents should have options when it comes to their kids’ education, even when it’s free.
  • "The idea was that, if you wanted to provide an excellent, gifted, and talented education for public school students, one could do a better job of that if in large cities there were specialized schools that would bring academically talented students together,"
  • These schools, some of which are centuries old, are concentrated in 31 states, including nearly three dozen total in New York City, Chicago, and Boston alone. All but three of these 31 states are located in the eastern half of the country,
  • "the trick," he said, "is you don’t want the selective-enrollment schools to become enclaves of privilege that are separate and unequal from the rest of the system."
  • getting into selective-enrollment schools typically requires having proactive parents who know how to navigate the system—a resource many children lack.
  • The clashes over selective-admissions policies reflect the challenges districts face in reconciling two goals that are often diametrically opposed: academic achievement and equity. How can a school be color blind while simultaneously promoting educational access and diversity?
  • "How do you recognize excellence on the one hand and promote genuine equal opportunity on the other?"
  • Can a fair selective-admissions system for public schools even exist?
  • urban school districts are nowhere near coming up with a model that works well and raises all students. The fact remains that many of these schools look and operate like elite schools exclusive to elite families.
  • These are schools renowned for their academic prowess and widely seen as conduits to the country’s top colleges. But, as the NAACP complaint demonstrates, they’re also notorious for their lack of racial diversity, enrolling disproportionate numbers of white and, in particular, Asian students, who made up 60 percent of the student bodies at these schools last year despite constituting just 15 percent of the city’s total enrollment.
  • Blacks and Latinos made up just 7 percent and 5 percent of the student bodies at these elite schools last year, respectively, even though the two groups together account for 70 percent of the public school population citywide.
  • many of New York City’s specialized high schools are more socioeconomically diverse than critics make them out to be.
  • "It’s not just a simple picture—there’s no one profile in this city," she said. "Those [test-only] schools are serving some first-generation strivers and working-class strivers that some of these other schools are not taking …
  • it’s hard to deny arguments that the test-only admissions policy can serve as a form of de facto discrimination. The multiple-choice exam is so rigorous some students devote entire summers to studying for it, often with the help of private tutors or intensive prep courses that cost thousands of dollars
  • much of the prejudice traces back to the lack of equal educational opportunity in kids’ earlier years, which effectively debunks the notion that a test is the fairest way to assess a student’s eligibility for enrollment.
  • When it comes to admission to one of the selective schools, most students only compete with their peers in the same tier. A student who lives in a single-parent household and relies on welfare, for example, would in theory rarely contend with a middle-class student for the same seat. Just 30 percent of the seats at each selective school go to the highest-scoring students, regardless of their tier; the rest, for the most part, are divided among the highest-performing students in each tier. That means the bar is typically set higher for kids in the upper tiers (the fourth tier corresponds with the highest median income) than for those in the lower ones. (A rough illustrative sketch of this allocation rule appears after this list.)
  • "Given the overlap between race and class in American society in cities like Chicago, giving a leg up to economically disadvantaged students will translate into [racial diversity],
  • Diversity aside, selective-enrollment high schools also raise questions about what the admissions process can do to an adolescent’s psyche, particularly when it places an inordinate emphasis on testing
  • Forget Halloween, weekend sleepovers with friends, playing outdoors. For many eighth graders in New York City, the fall is synonymous with tutors and exams, while the spring brings intense competition—and often volatile emotions—over placement in coveted spots at the city’s best high schools.
  • As for the students, "you’re given a cornucopia of beautiful and horrible choices and then held up, feeling like you’re being assessed and placed and feeling like your life is not your own," Szuflita said. "It feels very uncertain, and it feels like there are great triumphs and disasters."
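A note on the tier mechanics described a few bullets above: the allocation rule quoted in the excerpt (roughly 30 percent of seats to the highest scorers citywide, the rest divided among the highest scorers within each socioeconomic tier) amounts to a simple two-pass selection. The Python sketch below is only an illustration built on assumptions drawn from the excerpt: the 30 percent share, the four tiers, the even split, and the function and variable names are all hypothetical, not the district's actual procedure.

# Hypothetical sketch of the tier-based allocation described in the excerpt:
# a share of seats goes to the top scorers citywide, and the rest is split
# evenly among socioeconomic tiers, filled by each tier's top remaining scorers.
# All specifics here (30% rank share, 4 tiers, even split) are assumptions.

def allocate_seats(applicants, total_seats, rank_share=0.30, num_tiers=4):
    """applicants: list of (student_id, tier, score) tuples; returns admitted ids."""
    # Rank every applicant by score, highest first.
    ranked = sorted(applicants, key=lambda a: a[2], reverse=True)

    # Pass 1: citywide "rank" seats go to the top scorers regardless of tier.
    rank_seats = int(total_seats * rank_share)
    admitted = [a[0] for a in ranked[:rank_seats]]
    taken = set(admitted)

    # Pass 2: split the remaining seats evenly across tiers, filling each
    # tier's share with its highest-scoring applicants not yet admitted.
    per_tier = (total_seats - rank_seats) // num_tiers
    for tier in range(1, num_tiers + 1):
        pool = [a for a in ranked if a[1] == tier and a[0] not in taken]
        for student_id, _, _ in pool[:per_tier]:
            admitted.append(student_id)
            taken.add(student_id)

    # Seats left over from rounding go to the next-highest scorers overall.
    for student_id, _, _ in ranked:
        if len(admitted) >= total_seats:
            break
        if student_id not in taken:
            admitted.append(student_id)
            taken.add(student_id)
    return admitted

# Toy example: 200 applicants competing for 20 seats.
if __name__ == "__main__":
    import random
    random.seed(0)
    apps = [("s%d" % i, random.randint(1, 4), random.randint(600, 900)) for i in range(200)]
    print(allocate_seats(apps, total_seats=20))

Even in this toy form, the trade-off is visible: per-tier quotas produce different effective score cutoffs for each tier, which is the higher bar for upper-tier kids that the excerpt describes.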
Javier E

Why conservative magazines are more important than ever - The Washington Post - 0 views

  • political magazines, of any persuasion, can be at their worst when ideological team spirit is strongest.
  • For conservative magazines, the years after Sept. 11, 2001, when patriotism seemed to demand loyalty to the White House, were such a time. “We did allow ourselves to become house organs for the Republican Party and the conservative movement,” says American Conservative blogger Rod Dreher, who worked at National Review from 2002 to 2003. “I would have denied it at the time, but that really happened.”
  • This also made many conservatives reluctant to confront the flaws of George W. Bush, even years after his presidency. “What did we think about compassionate conservatism? About No Child Left Behind? About the Iraq War? The truth is a lot of conservatives thought they were basically a mistake and badly considered,”
  • I think that was something the right had failed to wrestle with. They hadn’t had those conversations.”
  • right-of-center magazines have been debating and reassessing the soul of their political philosophy. Trumpism has torn down the conservative house and broken it up for parts. Conservative magazines are working to bring a plausible intellectual order to this new reality — and figure out what comes next.
  • “I’ve been a big critic of mainstream-media ideological blinders and biases, and I still am,” he said. “But we also have a president who lies aggressively, who lies casually, who lies about things that matter in huge ways and about things that don’t matter at all.”
  • Goldberg and writers Jay Nordlinger and Kevin D. Williamson are perhaps the most conspicuous members of National Review’s anti-Trump camp.
  • Hayes hopes that when a reader of the liberal magazine the Nation or a watcher of MSNBC seeks out an “intellectually honest conservative take,” that person will go to the Weekly Standard.
  • Kristol said he was reluctant to assess the present-day magazine as a whole, but he agreed that such a change was possible. “I feel now like I was unconsciously constraining the ways I was thinking,” he said. “You had friends. You had allies. You didn’t want to look too closely at the less savory parts of them.”
  • While the Weekly Standard has generally reflected a conventionally hawkish Republican worldview, it has also been willing to entertain varying political outlooks, with its writers landing in different places on Trump and many other matters. Labash, for instance, never hid his opposition to the war in Iraq. “It’s a magazine, not a cult,” he says. “You’re free to think freely.”
  • Goldberg told me that he had been spared any pressure from his employers to line up with the White House — “not a peep from a soul” at AEI or National Review — but that other employers were less tolerant. “One of the things I have much less respect for is Conservatism Inc.,” he said. “When the real histories of this period are done, one of the more important points is that institutions, both in the media and the think-tank universe, that are dependent on really large donor bases, they were among the first to give way.”
  • In response, Hayes has increased the magazine’s focus on reporting, he said, less for the purpose of winning debates than to rescue a sense of shared premises. “We thought it was important to focus on reporting and facts and try to determine what the facts are, so that we can have a big debate about policies we should pursue as a country based on a common understanding of those facts,”
  • Krein seemed more sanguine than most conservative intellectuals I met, viewing the changed policy discourse as a good in itself. “We have an honest question — what the role of the nation-state is,” Krein said. “This is a world of nation-states, but we no longer have any positive rationale for them. Those questions need to be worked out.”
  • Most of the magazine’s writers are somewhere in between. “We have a number of writers who are vehemently anti-Trump; I’m one of them,” says National Review Online editor Charles C.W. Cooke. “That doesn’t mean he can’t do anything right. That would be to throw my brain away.”
  • “One of the giant ironies of this whole phenomenon for us is that Trump represents a cartoonish, often exaggerated, version of the direction we wanted to see the party go in,” Lowry said. “Trump was in a very different place on regulation and trade, but we had been widening the lens of mainstream conservatism and arguing that the party needed to be more populist.”
  • “National Review has absolutely become more interesting,” says Helen Andrews, an essayist who has written for nearly all of the publications mentioned in this article. “When Trump won, I thought that’s it. National Review is done. There’s no way they can bounce back. But it turns out that all the folks over there that I thought were peacetime consiglieres were actually ready to seize the moment.”
  • Other contributors, like Dennis Prager and Victor Davis Hanson, reliably line up behind Trump, arguing he’s the only defense against an overpowering left.
  • Merry’s hope, in the face of what he feels are increasingly unfavorable odds, is that Trump will fulfill some of his promises. Assessing that will be one of the main goals of the magazine in the coming years. “We’re interested in the Trump constituency,” Merry says. “The question for us is whether Trump is proving worthy of his voters.”
  • One curse of the American Conservative, starting with Iraq, has been to serve as an unheeded voice in the face of indifferent or hostile elite opinion. In 2011, Larison was sounding repeated warnings against intervening in Libya, and for several years, before more famous names took notice, he was a lonely voice against the Saudi war in Yemen.
  • Back in June 2016, the magazine ran a cover story by McConnell, “Why Trump Wins,” which argued that globalism vs. nationalism was the new defining issue in our politics and that GOP elites would be unable to “put the lid on the aspirations Trump has unleashed.”
  • A sense of the political power of cultural conversations likewise inspired former Senate staffer Ben Domenech, now 36, to launch the Federalist in the fall of 2013
  • Each of them is playing a distinct role on the right.
  • Modern Age, founded by conservative luminary Russell Kirk in 1957 and operated by the Intercollegiate Studies Institute, takes what may be the most high-toned approach to politics, with many academic contributors, and McCarthy hopes to see its pages synthesizing ideas from different strains of conservatism.
  • The National Interest, co-founded in 1985 by the late Irving Kristol, father of Bill, remains devoted to foreign-policy realism, offering thoughtful articles on what role the United States should play on a changed world stage.
  • National Affairs, founded by former George W. Bush policy staffer Yuval Levin in 2009 as a venue in which conservative policy could be considered more deeply, spent the Obama years offering broad philosophical articles along with wonkier explorations of policymaking, from housing to public broadcasting. This continues, but after the rise of Trump, the journal has become even more introspective, running articles with titles like “Redeeming Ourselves” and “Is the Party Over?”
  • These publications are highly unlikely to affect the course of Trump, but, by making plausible sense of this moment sooner rather than later, they may affect the course of his successors.
  • Two surprising stars of the Trump era have been the Claremont Review of Books and the religious journal First Things. It was in the normally restrained Claremont Review of Books that someone going by the name “Publius Decius Mus” (later revealed to be Michael Anton) published “The Flight 93 Election,” an influential essay arguing that the election of Trump, however extreme the risks, was the only hope of preventing a complete surrender to the cultural left.
  • The trajectory of First Things, a journal of religion and public life founded in 1990, has been even more striking. Its editor, R.R. Reno, contributed to the “Against Trump” issue of National Review but became increasingly frustrated by what he felt was the failure of his fellow conservatives to understand the nature of the rebellion taking place. Eventually, Reno wound up signing on to a “Statement of Unity” in support of Trump by a group called Scholars & Writers for America. First Things is now devoting itself to understanding the altered political and cultural landscape. “The conservative intellectual infrastructure is like a city after the neutron bomb goes off,” says Reno. “There’s a whole network of ideas, and it turns out there are no voters for those ideas.”
  • The monthly conservative magazine the New Criterion, edited by Roger Kimball, may devote the bulk of its pages to reviews of things like symphonies or art exhibits, but it was also among the first journals to take Trump seriously and understand, as contributor James Bowman put it in October 2015, that Trump spoke for “those whom the progressives have sought to shut out of decent society, which encompasses a much larger universe than that of the movement conservatives.”
  • Commentary, founded in 1945 by the American Jewish Committee — from which it separated in 2007, becoming a stand-alone nonprofit — has always balanced its forays into politics with grander musings on Western civilization, Judaism and high culture. This seems to be a successful combination in the Trump era, because the circulation, according to Podhoretz, has risen by over 20 percent since the 2016 election. Podhoretz, who has edited the magazine since 2009 (his father, Norman Podhoretz, edited it from 1960 to 1995), is known for a prickly and combative approach to public life
  • “It may be that Commentary is uniquely suited to the weirdness of this position because it has been a countercultural publication for close to 50 years. It is a Jewish publication on the right. It is a conservative publication in a liberal Jewish community. It remains a journal with literary, cultural and intellectual interests, which makes it a minority in the world of conservative opinion, which tends not to focus on the life of the cultural mind.”
  • Commentary has had several high-profile articles in the past year. In February 2017, it published “Our Miserable 21st Century,” by Nicholas N. Eberstadt, who argued that the economic insecurity of Americans spiked after 2000 and never recovered.
  • Many of the smallest conservative journals are unadorned and low in circulation. But, in keeping with the rule that what’s in the wilderness today can be most influential tomorrow, they too are awash in fresh ideas. “There’s still a pretty substantial community that relies on these publications as a channel of communications within the conservative neural network,” observes Daniel McCarthy, editor of one such journal, Modern Age. “They’re even more relevant today than they were in 2012.”
  • Domenech told me he started to envision a new kind of conservative opinion site after observing that more and more areas of our culture — movies, talk shows, sports — were becoming politicized.
  • The staff of the Federalist is majority female, half millennial, and a quarter minority, according to Domenech, and youthfulness was reflected in the publication’s design
  • By engaging in pop-culture debates, going on television, and focusing on engagement with writers and voices outside the conservative sphere, the Federalist hopes to reach audiences that might normally be dismissive
  • Conservative magazines, Domenech said, had been mistaken to think they spoke for voters on the right. “This battle was not over whether we’re going to have a Chamber of Commerce agenda or a constitutionalist agenda,” Domenech said. “It left out this huge swath of people who weren’t interested in either of those things.”
  • As much as their contributors may differ in opinion or even dislike one another, what unites these magazines — and distinguishes them from right-wing outlets like Breitbart — is an almost quaint belief in debate as an instrument of enlightenment rather than as a mere tool of political warfare.
  • “There’s an argument on part of the right that the left is utterly remorseless and we need to be like that,” says Lowry. “That’s the way you lose your soul and you have no standards.”
  • “You want to be a revolutionary on the right?” asks Labash. “Tell the truth. Call honest balls and strikes. That’s become pretty revolutionary behavior in these hopelessly tribal times.”
  • With so many Americans today engaged in partisan war, any publication with a commitment to honesty in argument becomes a potential peacemaker. It also becomes an indispensable forum for working out which ideas merit a fight in the first place. This is what, in their best moments, the conservative magazines are now doing.
Javier E

Opinion | When Public Health Loses the Public - The New York Times - 0 views

  • In “Within Reason: A Liberal Public Health for an Illiberal Time,” Sandro Galea, the dean of the Boston University School of Public Health, looks to his own field to explain the animating forces behind some of those disputes.
  • Despite remarkable successes, Galea argues, public health succumbed to a disturbing strain of illiberalism during the pandemic. This not only worsened the impact of the pandemic; it also destabilized public health institutions in ways that will serve us poorly when the next crisis comes.
  • If Americans have come to distrust public health advice, what role may public health officials have played in fostering that distrust?
  • American health experts advocated almost universal child vaccination; meanwhile, in Europe, experts cautioned against vaccinating young children, who were at low risk for serious illness, without more long-term data. “Were we pushing to vaccinate children for their sake or for ours?” Galea asks. “Were we doing it to support health or to make a political point?”
  • Scientists should have made more nuanced risk assessments and revisited them regularly. They should have taken into account the consequences and the disproportionate impact of strict lockdowns on lower-income workers and at-risk youth
  • This zero-sum mode of thinking — neglecting to take into account one’s own biases, succumbing to groupthink, operating according to the expectations of one’s “side,” discouraging good-faith debate — persisted even as the pandemic eased.
  • this tendency to view “core issues in Manichaean terms, with certain positions seen as on the side of good and others on the side of evil, with little gray area between,” as Galea puts it, has continued to inform public health postpandemic
  • It also undermines public faith in science, one of the few institutions that had maintained a high level of trust into the Trump era.
  • the percentage of Americans who believe science has a mostly positive effect on society dropped to 57 percent in 2023, from 67 percent in 2016. Those who say they have a great deal of confidence in scientists dropped to 23 percent, from 39 percent in 2020. And these declines took place among both Republicans and Democrats.
Javier E

He Could Have Seen What Was Coming: Behind Trump's Failure on the Virus - The New York ... - 0 views

  • “Any way you cut it, this is going to be bad,” a senior medical adviser at the Department of Veterans Affairs, Dr. Carter Mecher, wrote on the night of Jan. 28, in an email to a group of public health experts scattered around the government and universities. “The projected size of the outbreak already seems hard to believe.”
  • A week after the first coronavirus case had been identified in the United States, and six long weeks before President Trump finally took aggressive action to confront the danger the nation was facing — a pandemic that is now forecast to take tens of thousands of American lives — Dr. Mecher was urging the upper ranks of the nation’s public health bureaucracy to wake up and prepare for the possibility of far more drastic action.
  • Throughout January, as Mr. Trump repeatedly played down the seriousness of the virus and focused on other issues, an array of figures inside his government — from top White House advisers to experts deep in the cabinet departments and intelligence agencies — identified the threat, sounded alarms and made clear the need for aggressive action.
  • The president, though, was slow to absorb the scale of the risk and to act accordingly, focusing instead on controlling the message, protecting gains in the economy and batting away warnings from senior officials.
  • Mr. Trump’s response was colored by his suspicion of and disdain for what he viewed as the “Deep State” — the very people in his government whose expertise and long experience might have guided him more quickly toward steps that would slow the virus, and likely save lives.
  • The slow start of that plan, on top of the well-documented failures to develop the nation’s testing capacity, left administration officials with almost no insight into how rapidly the virus was spreading. “We were flying the plane with no instruments,” one official said.
  • But dozens of interviews with current and former officials and a review of emails and other records revealed many previously unreported details and a fuller picture of the roots and extent of his halting response as the deadly virus spread:
  • The National Security Council office responsible for tracking pandemics received intelligence reports in early January predicting the spread of the virus to the United States, and within weeks was raising options like keeping Americans home from work and shutting down cities the size of Chicago. Mr. Trump would avoid such steps until March.
  • Despite Mr. Trump’s denial weeks later, he was told at the time about a Jan. 29 memo produced by his trade adviser, Peter Navarro, laying out in striking detail the potential risks of a coronavirus pandemic: as many as half a million deaths and trillions of dollars in economic losses.
  • The health and human services secretary, Alex M. Azar II, directly warned Mr. Trump of the possibility of a pandemic during a call on Jan. 30, the second warning he delivered to the president about the virus in two weeks. The president, who was on Air Force One while traveling for appearances in the Midwest, responded that Mr. Azar was being alarmist
  • Mr. Azar publicly announced in February that the government was establishing a “surveillance” system
  • the task force had gathered for a tabletop exercise — a real-time version of a full-scale war gaming of a flu pandemic the administration had run the previous year. That earlier exercise, also conducted by Mr. Kadlec and called “Crimson Contagion,” predicted 110 million infections, 7.7 million hospitalizations and 586,000 deaths following a hypothetical outbreak that started in China.
  • By the third week in February, the administration’s top public health experts concluded they should recommend to Mr. Trump a new approach that would include warning the American people of the risks and urging steps like social distancing and staying home from work.
  • But the White House focused instead on messaging and crucial additional weeks went by before their views were reluctantly accepted by the president — time when the virus spread largely unimpeded.
  • When Mr. Trump finally agreed in mid-March to recommend social distancing across the country, effectively bringing much of the economy to a halt, he seemed shellshocked and deflated to some of his closest associates. One described him as “subdued” and “baffled” by how the crisis had played out. An economy that he had wagered his re-election on was suddenly in shambles.
  • He only regained his swagger, the associate said, from conducting his daily White House briefings, at which he often seeks to rewrite the history of the past several months. He declared at one point that he “felt it was a pandemic long before it was called a pandemic,” and insisted at another that he had to be a “cheerleader for the country,” as if that explained why he failed to prepare the public for what was coming.
  • Mr. Trump’s allies and some administration officials say the criticism has been unfair.
  • The Chinese government misled other governments, they say. And they insist that the president was either not getting proper information, or the people around him weren’t conveying the urgency of the threat. In some cases, they argue, the specific officials he was hearing from had been discredited in his eyes, but once the right information got to him through other channels, he made the right calls.
  • “While the media and Democrats refused to seriously acknowledge this virus in January and February, President Trump took bold action to protect Americans and unleash the full power of the federal government to curb the spread of the virus, expand testing capacities and expedite vaccine development even when we had no true idea the level of transmission or asymptomatic spread,” said Judd Deere, a White House spokesman.
  • Decision-making was also complicated by a long-running dispute inside the administration over how to deal with China
  • The Containment Illusion: By the last week of February, it was clear to the administration’s public health team that schools and businesses in hot spots would have to close. But in the turbulence of the Trump White House, it took three more weeks to persuade the president that failure to act quickly to control the spread of the virus would have dire consequences.
  • There were key turning points along the way, opportunities for Mr. Trump to get ahead of the virus rather than just chase it. There were internal debates that presented him with stark choices, and moments when he could have chosen to ask deeper questions and learn more. How he handled them may shape his re-election campaign. They will certainly shape his legacy.
  • Facing the likelihood of a real pandemic, the group needed to decide when to abandon “containment” — the effort to keep the virus outside the U.S. and to isolate anyone who gets infected — and embrace “mitigation” to thwart the spread of the virus inside the country until a vaccine becomes available.
  • Among the questions on the agenda, which was reviewed by The New York Times, was when the department’s secretary, Mr. Azar, should recommend that Mr. Trump take textbook mitigation measures “such as school dismissals and cancellations of mass gatherings,” which had been identified as the next appropriate step in a Bush-era pandemic plan.
  • The group — including Dr. Anthony S. Fauci of the National Institutes of Health; Dr. Robert R. Redfield of the Centers for Disease Control and Prevention, and Mr. Azar, who at that stage was leading the White House Task Force — concluded they would soon need to move toward aggressive social distancing
  • A 20-year-old Chinese woman had infected five relatives with the virus even though she never displayed any symptoms herself. The implication was grave — apparently healthy people could be unknowingly spreading the virus — and supported the need to move quickly to mitigation.
  • The following day, Dr. Kadlec and the others decided to present Mr. Trump with a plan titled “Four Steps to Mitigation,” telling the president that they needed to begin preparing Americans for a step rarely taken in United States history.
  • a presidential blowup and internal turf fights would sidetrack such a move. The focus would shift to messaging and confident predictions of success rather than publicly calling for a shift to mitigation.
  • These final days of February, perhaps more than any other moment during his tenure in the White House, illustrated Mr. Trump’s inability or unwillingness to absorb warnings coming at him.
  • He instead reverted to his traditional political playbook in the midst of a public health calamity, squandering vital time as the coronavirus spread silently across the country.
  • A memo dated Feb. 14, prepared in coordination with the National Security Council and titled “U.S. Government Response to the 2019 Novel Coronavirus,” documented what more drastic measures would look like, including: “significantly limiting public gatherings and cancellation of almost all sporting events, performances, and public and private meetings that cannot be convened by phone. Consider school closures. Widespread ‘stay at home’ directives from public and private organizations with nearly 100% telework for some.”
  • his friend had a blunt message: You need to be ready. The virus, he warned, which originated in the city of Wuhan, was being transmitted by people who were showing no symptoms — an insight that American health officials had not yet accepted.
  • On the 18-hour plane ride home, Mr. Trump fumed as he watched the stock market crash after Dr. Messonnier’s comments. Furious, he called Mr. Azar when he landed at around 6 a.m. on Feb. 26, raging that Dr. Messonnier had scared people unnecessarily.
  • The meeting that evening with Mr. Trump to advocate social distancing was canceled, replaced by a news conference in which the president announced that the White House response would be put under the command of Vice President Mike Pence.
  • The push to convince Mr. Trump of the need for more assertive action stalled. With Mr. Pence and his staff in charge, the focus was clear: no more alarmist messages. Statements and media appearances by health officials like Dr. Fauci and Dr. Redfield would be coordinated through Mr. Pence’s office
  • It would be more than three weeks before Mr. Trump would announce serious social distancing efforts, a lost period during which the spread of the virus accelerated rapidly. Over nearly three weeks from Feb. 26 to March 16, the number of confirmed coronavirus cases in the United States grew from 15 to 4,226
  • The China Factor: The earliest warnings about coronavirus got caught in the crosscurrents of the administration’s internal disputes over China. It was the China hawks who pushed earliest for a travel ban. But their animosity toward China also undercut hopes for a more cooperative approach by the world’s two leading powers to a global crisis.
  • It was early January, and the call with a Hong Kong epidemiologist left Matthew Pottinger rattled.
  • Mr. Trump was walking up the steps of Air Force One to head home from India on Feb. 25 when Dr. Nancy Messonnier, the director of the National Center for Immunization and Respiratory Diseases, publicly issued the blunt warning they had all agreed was necessary.
  • It was one of the earliest warnings to the White House, and it echoed the intelligence reports making their way to the National Security Council
  • some of the more specialized corners of the intelligence world were producing sophisticated and chilling warnings.
  • In a report to the director of national intelligence, the State Department’s epidemiologist wrote in early January that the virus was likely to spread across the globe, and warned that the coronavirus could develop into a pandemic
  • Working independently, a small outpost of the Defense Intelligence Agency, the National Center for Medical Intelligence, came to the same conclusion.
  • By mid-January there was growing evidence of the virus spreading outside China. Mr. Pottinger began convening daily meetings about the coronavirus
  • The early alarms sounded by Mr. Pottinger and other China hawks were freighted with ideology — including a push to publicly blame China that critics in the administration say was a distraction
  • And they ran into opposition from Mr. Trump’s economic advisers, who worried a tough approach toward China could scuttle a trade deal that was a pillar of Mr. Trump’s re-election campaign.
  • Mr. Pottinger continued to believe the coronavirus problem was far worse than the Chinese were acknowledging. Inside the West Wing, the director of the Domestic Policy Council, Joe Grogan, also tried to sound alarms that the threat from China was growing.
  • The Consequences of Chaos: The chaotic culture of the Trump White House contributed to the crisis. A lack of planning and a failure to execute, combined with the president’s focus on the news cycle and his preference for following his gut rather than the data, cost time, and perhaps lives.
  • the hawks kept pushing in February to take a critical stance toward China amid the growing crisis. Mr. Pottinger and others — including aides to Secretary of State Mike Pompeo — pressed for government statements to use the term “Wuhan Virus.” Mr. Pompeo tried to hammer the anti-China message at every turn, eventually even urging leaders of the Group of 7 industrialized countries to use “Wuhan virus” in a joint statement.
  • Others, including aides to Mr. Pence, resisted taking a hard public line, believing that angering Beijing might lead the Chinese government to withhold medical supplies, pharmaceuticals and any scientific research that might ultimately lead to a vaccine.
  • Mr. Trump took a conciliatory approach through the middle of March, praising the job Mr. Xi was doing.
  • That changed abruptly, when aides informed Mr. Trump that a Chinese Foreign Ministry spokesman had publicly spun a new conspiracy about the origins of Covid-19: that it was brought to China by U.S. Army personnel who visited the country last October.
  • On March 16, he wrote on Twitter that “the United States will be powerfully supporting those industries, like Airlines and others, that are particularly affected by the Chinese Virus.”
  • Mr. Trump’s decision to escalate the war of words undercut any remaining possibility of broad cooperation between the governments to address a global threat
  • Mr. Pottinger, backed by Mr. O’Brien, became one of the driving forces of a campaign in the final weeks of January to convince Mr. Trump to impose limits on travel from China
  • he circulated a memo on Jan. 29 urging Mr. Trump to impose the travel limits, arguing that failing to confront the outbreak aggressively could be catastrophic, leading to hundreds of thousands of deaths and trillions of dollars in economic losses.
  • The uninvited message could not have conflicted more with the president’s approach at the time of playing down the severity of the threat. And when aides raised it with Mr. Trump, he responded that he was unhappy that Mr. Navarro had put his warning in writing.
  • From the time the virus was first identified as a concern, the administration’s response was plagued by the rivalries and factionalism that routinely swirl around Mr. Trump and, along with the president’s impulsiveness, undercut decision making and policy development.
  • Even after Mr. Azar first briefed him about the potential seriousness of the virus during a phone call on Jan. 18 while the president was at his Mar-a-Lago resort in Florida, Mr. Trump projected confidence that it would be a passing problem.
  • “We have it totally under control,” he told an interviewer a few days later while attending the World Economic Forum in Switzerland. “It’s going to be just fine.”
  • The efforts to sort out policy behind closed doors were contentious and sometimes only loosely organized.
  • That was the case when the National Security Council convened a meeting on short notice on the afternoon of Jan. 27. The Situation Room was standing room only, packed with top White House advisers, low-level staffers, Mr. Trump’s social media guru, and several cabinet secretaries. There was no checklist about the preparations for a possible pandemic,
  • Instead, after a 20-minute description by Mr. Azar of his department’s capabilities, the meeting was jolted when Stephen E. Biegun, the newly installed deputy secretary of state, announced plans to issue a “level four” travel warning, strongly discouraging Americans from traveling to China. The room erupted into bickering.
  • A few days later, on the evening of Jan. 30, Mick Mulvaney, the acting White House chief of staff at the time, and Mr. Azar called Air Force One as the president was making the final decision to go ahead with the restrictions on China travel. Mr. Azar was blunt, warning that the virus could develop into a pandemic and arguing that China should be criticized for failing to be transparent.
  • Stop panicking, Mr. Trump told him. That sentiment was present throughout February, as the president’s top aides reached for a consistent message but took few concrete steps to prepare for the possibility of a major public health crisis.
  • As February gave way to March, the president continued to be surrounded by divided factions even as it became clearer that avoiding more aggressive steps was not tenable.
  • the virus was already multiplying across the country — and hospitals were at risk of buckling under the looming wave of severely ill people, lacking masks and other protective equipment, ventilators and sufficient intensive care beds. The question loomed over the president and his aides after weeks of stalling and inaction: What were they going to do?
  • Even then, and even by Trump White House standards, the debate over whether to shut down much of the country to slow the spread was especially fierce.
  • In a tense Oval Office meeting, when Mr. Mnuchin again stressed that the economy would be ravaged, Mr. O’Brien, the national security adviser, who had been worried about the virus for weeks, sounded exasperated as he told Mr. Mnuchin that the economy would be destroyed regardless if officials did nothing.
  • in the end, aides said, it was Dr. Deborah L. Birx, the veteran AIDS researcher who had joined the task force, who helped to persuade Mr. Trump. Soft-spoken and fond of the kind of charts and graphs Mr. Trump prefers, Dr. Birx did not have the rough edges that could irritate the president. He often told people he thought she was elegant.
  • During the last week in March, Kellyanne Conway, a senior White House adviser involved in task force meetings, gave voice to concerns other aides had. She warned Mr. Trump that his wished-for date of Easter to reopen the country likely couldn’t be accomplished. Among other things, she told him, he would end up being blamed by critics for every subsequent death caused by the virus.
Javier E

After Federalist No. 10 | National Affairs - 0 views

  • Federalist No. 10 pertains to the orientation of personal appetites toward public ends, which include both the common good and private rights. The essay recognizes that these appetites cannot be conquered, but they can be conditioned.
  • Madison's solution to the problem of faction — a solution he confines to the four corners of majority rule — is to place majorities in circumstances that encourage deliberation and thus defuse passion.
  • this solution does not depend on any specific constitutional mechanism:
  • Any republic deployed across an extended territory should be relatively free of faction, at least in the aggregate.
  • Yet Madison's solution depends on certain assumptions. Federalist No. 10 assumes politics will occur at a leisurely pace. The regime Madison foresees is relatively passive, not an active manipulator of economic arrangements. And he is able to take for granted a reasonably broad consensus as to the existence if not the content of the public good.
  • These assumptions are now collapsing under the weight of positive government and the velocity of our political life.
  • Given the centrality of Federalist No. 10 to the American constitutional canon, this collapse demands a reckoning. If a pillar of our order is crumbling, something must replace it.
  • That challenge may call for a greater emphasis on the sources of civic virtue and on the means of sustaining it.
  • The possibility that virtue might be coded into the essay is evident at its most elemental level: Federalist No. 10's definition of a faction as a group "united and actuated by some common impulse of passion, or of interest, adverse to the rights of other citizens, or to the permanent and aggregate interests of the community."
  • this definition hinges on an objective understanding of the public good; one cannot comprehend Madison from the perspective of contemporary relativism.
  • Its reader must be committed to a normative concept of the good and occupy a polity in which it is possible for such a concept to be broadly shared.
  • "[T]hose who do not believe in an objective moral order cannot 'enter' Madison's system." Thus, belief in such an order, even amid disputes as to its content, constitutes a first unstated assumption of Federalist No. 10.
  • Madison presents a series of choices, repeatedly eliminating one, then bifurcating the other in turn, and eliminating again until he arrives at his solution. One can remove the causes of factions or control their effects. The causes cannot be removed because the propensity to disagree is "sown in the nature of man," arising particularly from the fact that man is "fallible" and his "opinions and his passions...have a reciprocal influence on each other."
  • Precisely because this influence arises from the link between "reason" and "self-love," the latter of which distorts the former, property accounts for "the most common and durable source of factions," the key being its durability.
  • Whereas David Hume's analysis of parties said that those based on self-interest were the most excusable while those based on passions were the most dangerous, Madison warns of the reverse. Those rooted in emotion — including "an attachment to different leaders ambitiously contending for pre-eminence and power" — are the least worrisome precisely because they are based on passions, which Madison believes to be transient.
  • A second assumption of Federalist No. 10 is consequently that irrational passions, which Madison understands to be those not based on interest, are inherently unsustainable and thus are naturally fleeting.
  • Having dismissed minority factions, Madison turns his attention to abusive majorities.
  • if a group is impelled by ill motives, the intrinsic conditions of an extended republic will make it difficult for it to become a majority.
  • A third assumption, then, is that both geographic and constitutional distance will permit the passions to dissipate before their translation into policy.
  • Finally, Madison cautions Jefferson in correspondence about a month before Federalist No. 10's publication that the extended-republic theory "can only hold within a sphere of a mean extent. As in too small a sphere oppressive combinations may be too easily formed agst. the weaker party; so in too extensive a one, a defensive concert may be rendered too difficult against the oppression of those entrusted with the administration."
  • To recapitulate, the assumptions are as follows: The people will share a belief in the existence of an objective moral order, even if they dispute its content; passions, especially when they pertain to attachments or aversions to political leaders, will be unsustainable; government will not dictate the distribution of small economic advantages; geographic and constitutional distance will operate to dissipate passions; and, finally, the territory will not be so large that public opinion cannot form.
  • none of them stands in a form that would be recognizable to Madison today.
  • ASSUMPTIONS UNDONE
  • It is almost universally acknowledged that moral relativism is ascendant in contemporary American society.
  • The question, rather, is whether the foundational assumptions of Federalist No. 10 can withstand the pressure of contemporary communications technology. There is reason to believe they cannot.
  • There is a balance to be struck: Communication is useful insofar as it makes the "mean extent" that was Madison's final assumption larger by enabling the formation of a "defensive concert" through the cultivation of public consensus against an abusive regime. But on Madison's account, the returns on rapid communication should diminish beyond this point because there will be no space in which passions can calm before impulse and decision converge.
  • what is clear is that there are enough opinions dividing the country that any project attempting to form a coherent public will seems doomed.
  • The Madisonian impulse is to look first for institutional solutions that can discipline interest groups. Constitutional mechanisms like judicial review, then, might be used to inhibit factions. But judicial review can be done well or poorly.
  • The empirical conditions not merely of an extensive republic but of 18th-century reality aided in Madison's effort. The deliberate pace of communication did not require an institutional midwife. It was a fact of life. It need hardly be said that, 230 years after the essay's November 1787 publication, this condition no longer obtains. The question is what replaces it.
  • The answer is that the converse of each assumption on which Federalist No. 10 relies is a restraining virtue.
  • If Federalist No. 10 assumes at least consensus as to the existence of an objective morality, pure moral relativism must be challenged.
  • If the immediate translation of preferences into policy is possible but detrimental, patience must intervene.
  • If technology has erased the constitutional distance between officeholders and constituents, self-restraint and deference may be required.
  • If it has also shrunk attention spans to 140 characters, an ethic of public spiritedness will have to expand them.
  • What unites these is civic virtue, and thus the American regime must now get serious about its recovery
  • He wrote in Federalist No. 55: As there is a degree of depravity in mankind which requires a certain degree of circumspection and distrust, so there are other qualities in human nature which justify a certain portion of esteem and confidence. Republican government presupposes the existence of these qualities in a higher degree than any other form. Were the pictures which have been drawn by the political jealousy of some among us faithful likenesses of the human character, the inference would be that there is not sufficient virtue among men for self-government; and that nothing less than the chains of despotism can restrain them from destroying and devouring one another.
  • At Virginia's ratifying convention, similarly, Madison noted the propensity to assume either the worst or the best from politicians. He replied:
  • But I go on this great republican principle, that the people will have virtue and intelligence to select men of virtue and wisdom. Is there no virtue among us? If there be not, we are in a wretched situation. No theoretical checks — no form of government can render us secure. To suppose that any form of government will secure liberty or happiness without any virtue in the people, is a chimerical idea.
  • Still, the traditional means of inculcating virtue — the family and institutions such as local schools — are themselves under pressure or subject to political capture.
  • A national effort to instill civic virtue would almost certainly careen into the kind of politicization that has been witnessed in Education Department history standards and the like.
  • Consequently, subsidiarity, the diffusion of authority to the most local possible level, would be vital to any effective effort to revive civic virtue. That is, it could not be uniform or imposed from on high. Political leaders could help in cultivating an awareness of its necessity, but not in dictating its precise terms.
  • The first part of this combination is moral virtue, which the ethic of subsidiarity teaches is likelier to come from the home than from school, and from life lessons than from textbooks.
  • Students as early as elementary school routinely learn the virtues of the Bill of Rights, in part because it is shorter and simpler to teach than the main body of the Constitution.
  • The success of civic education is nowhere clearer than in the arguably distorting effect it has had in provoking what Mary Ann Glendon calls "rights talk," the substitution of assertions of rights for persuasive argumentation about politics
  • Of these virtues, patience will surely be the hardest to restore. This is, to be clear, patience not as a private but rather as a civic virtue.
  • It asks that they consider issues in dimensions deeper than a tweet or, more precisely, that they demand that those they elect do so and thus do not expect their passions to be regularly fed.
  • Perhaps the best that can be achieved here is refusing to allow the positive state to reach further into the minutiae of economic life, generating more spaces for minority factions to hide.
  • As any reader of Lincoln's Temperance Address knows, neither heroic self-restraint nor clobbering, moralistic education will succeed in inculcating such virtues as patience and moderation. A combined educational program is necessary, and politics in any modern sense can only account for part of it.
  • civic education can achieve constitutional ends. Of course, rights as contemporarily understood are entitlements; they supply us with something. Civic virtue, by contrast, demands something of us, and as such presents a more substantial political challenge.
  • The second is a shift in civic education from the entitlement mentality of the Bill of Rights to the constitutional architecture of the overall regime, with the latter engendering an appreciation of the cadences and distances at which it is intended to function and the limited objects it is intended to attain.
  • While Madison's "mean extent" for a republic has, in the modern United States, far exceeded the scope possible for forming a public will with respect to most particular issues, it may still be possible to form a coherent if thin understanding of the regime and, consequently, a defensive concert to safeguard it.
  • a recognition that virtue is more necessary now than it used to be — when empirical conditions imposed patience and distance — does not rely on virtue in any blind or total sense. It does not, for example, seek to replace the institutional mechanisms Madison elucidates elsewhere with virtue. It simply recognizes that the particular assumptions of Federalist No. 10 no longer operate without added assistance. In other words, as Daniel Mahoney has argued, we must theorize the virtue that the founders could presuppose.
  • The issue, then, is not that civic virtue is all that is important to the Madisonian system; it is that civic virtue is more important than it used to be for one pillar of that system.
Javier E

How China's buses shaped the world's EV revolution - BBC Future - 0 views

  • After around two decades of government support, China now boasts the world's largest market for e-buses, making up more than 95% of global stock. At the end of 2022, China's Ministry of Transport announced that more than three-quarters (77% or 542,600) of all urban buses in the country were "new energy vehicles", a term used by the Chinese government to include pure electric, plug-in hybrids, and fuel cell vehicles powered by alternative fuels such as hydrogen and methanol. In 2022, around 84% of the new energy bus fleet was pure electric.
  • In 2015, 78% of Chinese urban buses still used diesel or gas, according to the World Resources Institute (WRI). The NGO now estimates that if China follows through on its stated decarbonisation policies, its road transport emissions will peak before 2030.
  • China is also home to some of the world's biggest electric bus manufacturers, such as Yutong, which has been racking up orders across China, Europe and Latin America.
  • ...32 more annotations...
  • "China has really been at the forefront of success in conversion of all vehicles to electric vehicles, especially buses," says Heather Thompson, chief executive officer of the Institute for Transportation and Development Policy (ITDP), a non-profit focusing on sustainable transport solutions. "The rest of the world is trying to do the same, but I think China is really out ahead."
  • At the time of China's 2001 entry into the World Trade Organisation, the international automotive industry was dominated by European, US and Japanese brands. These companies had spent decades perfecting internal combustion engine technology. To compete, Beijing decided to find a new track for its auto industry: making cars that did not use conventional engines.
  • That same year, the central government launched the so-called "863 plan" for EV research and development. There were numerous practical challenges, however, in the way of mass electrification: few manufacturers were making new energy vehicles, buyers were scarce and charging infrastructure barely existed. The answer? Buses.
  • "The Chinese government adopted a very smart strategy," says Liu Daizong, ITDP's East Asia director. "They realised quite early on that they should drive [the EV industry] through electric buses," he notes, since their public service status meant Beijing "could have a strong hand on their electrification".
  • "Bus routes were fixed. This means when an electric bus finished a round, it could return to the depot to recharge," explains Xue Lulu, a mobility manager at the World Resources Institute (WRI) China. The typical daily mileage of a Chinese bus – 200km (120 miles) – was a realistic range for battery makers to meet.
  • In 2009, the country began its large-scale rollout of new energy buses, with the "Ten Cities and Thousand Vehicles" programme. Over three years, the programme aimed to provide 10 cities with financial subsidies to promote 1,000 public-sector new energy vehicles in each, annually. Its goal was to have 10% new energy vehicles in the country by the end of 2012.
  • Strong policy support from both central and regional governments "gave manufacturers confidence in setting up production lines and stepping up research efforts," says Liu.
  • Together, these strong and consistent government signals encouraged Chinese manufacturers to expand their EV production capacity, bring down costs and improve their technologies. One such company was Build Your Dream, better known as BYD. The Shenzhen-based firm, the world's largest EV maker in 2022, had ballooned its business a decade earlier by supplying electric buses and taxis for China's EV pilot cities.
  • "Back then, most buses used diesel, which was a main source of nitrogen oxides (NOx) emissions," says Xue, referring to the air pollution that smothered Beijing and other Chinese cities in the early 2010s. Yet in 2013, a new plan from central government cited tackling air pollution as one of the reasons for rolling out EVs.
  • This addition proved to be critical: it not only connected EV uptake with people's health, it also indirectly tied the e-bus campaign to local officials' political performance, as the central government would soon hand air-quality targets to all provinces.
  • The years 2013 and 2014 proved to be important for China's EV push. For the first time, the central government made EV purchase subsidies available to individual consumers, not just the public sector, opening the floodgate to private ownership. Additionally, it offered discounted electricity tariffs to bus operators to make sure the cost of running electric buses would be "significantly lower than" that of their oil or gas-powered equivalents.
  • The new economic push, plus local government's determination to battle air pollution, generated great enthusiasm for e-buses. By the end of 2015, the number of EV pilot cities rocketed from 25 to 88. In the same year, the central government set a target of 200,000 new energy buses on the road by 2020 and announced a plan to phase out its subsidies for fossil-fuel-powered buses.
  • To further stimulate the market, many cities devised various local policies on top of national incentives. For example, Shenzhen, a southern city with a population of more than 17 million, encouraged government agencies to work with private companies to create a full range of renting mechanisms for bus operators.
  • Different cities' bus operators also designed different charging strategies. "Buses in Shenzhen had bigger batteries, so they normally charged overnight," says Xue, of WRI China. Between 2016 and 2020, Shanghai, another electric bus hub, subsidised the electricity e-buses used – regardless of the hours of the day – to give them more flexibility in charging.
  • Generous financial support did lead to problems. In 2016, an EV subsidy fraud shook China, with some bus operators found to have exaggerated the number of e-buses they had purchased. That same year, Beijing shifted its EV subsidy rules so that bus operators could only receive financial support once a bus's mileage reached 30,000km (19,000 miles).
  • One year later, the government announced the so-called "dual-credit" policy. This allowed new energy vehicle makers to rack up credits which they could sell for cash to those needing to offset "negative credits" generated from making conventional cars.
  • It wasn't only China's buses that had benefitted. China's e-bus campaign helped create a big and stable market for its wider EV industry, brought down the costs and created economies of scale. In 2009, the year the e-bus campaign was rolled out, the total number of new energy vehicles sold stood at 2,300; by 2022, it was 6.9 million, according to analysis by Huang Zheng.
  • By 2022, the country had also built the world's largest EV charging network, with 1.8 million public charging stations – or two-thirds of the global total – and 3.4 million private equivalents. This means that on average, there is one charging pillar for every 2.5 of China's 13.1 million new energy vehicles.
  • Cold weather is a problem, too, as it can make a battery's charging time longer and its range shorter. The reason China has not achieved 100% electrification for its buses is its northern regions, which have harsh winters, says Xue.
  • To make e-buses truly "green", they should also be charged with renewable power, Wang says. But last year coal power still accounted for 58.4% of China's energy mix, according to the China Electricity Council, a trade body.
  • Globally, however, China is now in a league of its own in uptake of e-buses. By 2018, about 421,000 of the world's 425,000 electric buses were located in China; Europe had about 2,250 and the US owned around 300.
  • But earlier this year, the European Commission announced a zero-emission target for all new city buses by 2030. And some countries are increasing their overall funding for the transition.
  • In 2020, the European Commission approved Germany's plan to double its aid for e-buses to €650m (£558m/$707m), then again in 2021 to €1.25bn (£1.07bn/$1.3bn). And the UK, which last year had the largest electric bus fleet in Europe with 2,226 pure electric and hybrid buses, has announced another £129m ($164m) to help bus operators buy zero-emissions fleets.
  • Countries have thus responded to China's manufacturing lead in divergent ways. "While the US has opted for a more competitive angle by fostering its own e-bus production, regions like Latin America are more open to trade with China due to a more friendly trading setup through [China's] Belt and Road Initiative,"
  • In order to avoid direct competition from Chinese manufacturers, the US has come up with a "school-bus strategy", says Liu. The Chinese don't make the iconic yellow vehicles, so this could ignite American e-bus manufacturing and create a local industry chain, he suggests. Backed by the US Environmental Protection Agency's $5bn (£3.9bn) Clean School Bus Programme, the national effort has so far committed to providing 5,982 buses.
  • In contrast, many Latin American cities, such as the Colombian capital of Bogota and the Chilean capital of Santiago, are greening their traditional bus sectors with the help of Chinese manufacturers, who are the largest providers to the region. In 2020, Chile became the country with the most Chinese e-buses outside of China, and this year Santiago's public transport operator announced it had ordered 1,022 e-buses from Beijing-based Foton Motor, the biggest overseas deal the firm has received.
  • Chinese manufacturers are likely to receive a lot more orders from Chile and its neighbours in this decade. According to the latest research by the global C40 Cities network, the number of electric buses in 32 Latin American cities is expected to increase more than sevenfold by 2030, representing an investment opportunity of over $11.3bn (£8.9bn).
  • In June 2023, BloombergNEF forecast half of the world's buses to be entirely battery-powered by 2032, a decade ahead of cars. And by 2026, 36% and 24% of municipal bus sales in Europe and the US, respectively, are expected to be EVs as they begin to catch up with China.
  • To meet the global climate goals set by the Paris Agreement, simply switching the world's existing bus fleets might not be enough. According to ITDP, the cumulative greenhouse gas emissions from urban passenger transport globally must stay below the equivalent of 66 gigatonnes of CO2 between 2020 and 2050 for the world to meet the 1.5C temperature goal. This emissions limit will only be possible when the world not only adopts electric buses, but goes through a broader shift away from private transport.
  • "We can't just focus on [replacing] the buses that exist, we need to actually get many, many more buses on the streets," Thompson adds. She and her team estimate that the world would need about 10 million more buses through 2030, and 46 million more buses cumulatively through 2050, to make public transport good enough to have a shot at achieving the Paris Agreement. And all those buses will need to be electric.
  • In China, therefore, even though EVs are selling faster than ever, the central government has instructed cities to encourage public transport use, as well as walking and riding bikes.
  • In Wang's hometown, meanwhile, which has just over three million residents, the local government has gone one step further and made all bus rides free. All citizens need to do is to swipe an app, with no charge, to get onto the bus. "My aunt loves taking buses now," says Wang. "She says it is so convenient."
Javier E

If We Knew Then What We Know Now About Covid, What Would We Have Done Differently? - WSJ - 0 views

  • A small cadre of aerosol scientists had a different theory. They suspected that Covid-19 was transmitted not so much by droplets but by smaller infectious aerosol particles that could travel on air currents way farther than 6 feet and linger in the air for hours. Some of the aerosol particles, they believed, were small enough to penetrate the cloth masks widely used at the time.
  • For much of 2020, doctors and public-health officials thought the virus was transmitted through droplets emitted from one person’s mouth and touched or inhaled by another person nearby. We were advised to stay at least 6 feet away from each other to avoid the droplets.
  • The group had a hard time getting public-health officials to embrace their theory. For one thing, many of them were engineers, not doctors.
  • ...37 more annotations...
  • “My first and biggest wish is that we had known early that Covid-19 was airborne,”
  • “Once you’ve realized that, it informs an entirely different strategy for protection.” Masking, ventilation and air cleaning become key, as well as avoiding high-risk encounters with strangers, he says.
  • Instead of washing our produce and wearing hand-sewn cloth masks, we could have made sure to avoid superspreader events and worn more-effective N95 masks or their equivalent. “We could have made more of an effort to develop and distribute N95s to everyone,” says Dr. Volckens. “We could have had an Operation Warp Speed for masks.”
  • We didn’t realize how important clear, straight talk would be to maintaining public trust. If we had, we could have explained the biological nature of a virus and warned that Covid-19 would change in unpredictable ways.  
  • In the face of a pandemic, he says, the public needs an early basic and blunt lesson in virology
  • “The science is really important, but if you don’t get the trust and communication right, it can only take you so far,”
  • and mutates, and since we’ve never seen this particular virus before, we will need to take unprecedented actions and we will make mistakes, he says.
  • Since the public wasn’t prepared, “people weren’t able to pivot when the knowledge changed,”
  • By the time the vaccines became available, public trust had been eroded by myriad contradictory messages—about the usefulness of masks, the ways in which the virus could be spread, and whether the virus would have an end date.
  • The absence of a single, trusted source of clear information meant that many people gave up on trying to stay current or dismissed the different points of advice as partisan and untrustworthy.
  • We didn’t know how difficult it would be to get the basic data needed to make good public-health and medical decisions. If we’d had the data, we could have more effectively allocated scarce resources
  • For much of the pandemic, doctors, epidemiologists, and state and local governments had no way to find out in real time how many people were contracting Covid-19, getting hospitalized and dying.
  • Doctors didn’t know what medicines worked. Governors and mayors didn’t have the information they needed to know whether to require masks. School officials lacked the information needed to know whether it was safe to open schools.
  • people didn’t know whether it was OK to visit elderly relatives or go to a dinner party.
  • Just months before the outbreak of the pandemic, the Council of State and Territorial Epidemiologists released a white paper detailing the urgent need to modernize the nation’s public-health system, still reliant on manual data collection methods—paper records, phone calls, spreadsheets and faxes.
  • While the U.K. and Israel were collecting and disseminating Covid case data promptly, in the U.S. the CDC couldn’t. It didn’t have a centralized health-data collection system like those countries did, but rather relied on voluntary reporting by underfunded state and local public-health systems and hospitals.
  • doctors and scientists say they had to depend on information from Israel, the U.K. and South Africa to understand the nature of new variants and the effectiveness of treatments and vaccines. They relied heavily on private data collection efforts such as a dashboard at Johns Hopkins University’s Coronavirus Resource Center that tallied cases, deaths and vaccine rates globally.
  • With good data, Dr. Ranney says, she could have better managed staffing and taken steps to alleviate the strain on doctors and nurses by arranging child care for them.
  • To solve the data problem, Dr. Ranney says, we need to build a public-health system that can collect and disseminate data and acts like an electrical grid. The power company sees a storm coming and lines up repair crews.
  • If we’d known how damaging lockdowns would be to mental health, physical health and the economy, we could have taken a more strategic approach to closing businesses and keeping people at home.
  • Yet many doctors say they were crucial at the start of the pandemic to give doctors and hospitals a chance to figure out how to accommodate and treat the avalanche of very sick patients.
  • The measures reduced deaths, according to many studies—but at a steep cost.
  • The lockdowns didn’t have to be so harmful, some scientists say. They could have been more carefully tailored to protect the most vulnerable, such as those in nursing homes and retirement communities, and to minimize widespread disruption.
  • Lockdowns could, during Covid-19 surges, close places such as bars and restaurants where the virus is most likely to spread, while allowing other businesses to stay open with safety precautions like masking and ventilation in place.  
  • If England’s March 23, 2020, lockdown had begun one week earlier, the measure would have nearly halved the estimated 48,600 deaths in the first wave of England’s pandemic
  • If the lockdown had begun a week later, deaths in the same period would have more than doubled.
  • The key isn’t to have the lockdowns last a long time, but that they are deployed earlier,
  • It is possible to avoid lockdowns altogether. Taiwan, South Korea and Hong Kong—all countries experienced at handling disease outbreaks such as SARS in 2003 and MERS—avoided lockdowns by widespread masking, tracking the spread of the virus through testing and contact tracing and quarantining infected individuals.
  • Had we known that even a mild case of Covid-19 could result in long Covid and other serious chronic health problems, we might have calculated our own personal risk differently and taken more care.
  • Early in the pandemic, public-health officials were clear: The people at increased risk for severe Covid-19 illness were those who were older or immunocompromised, or who had chronic kidney disease, Type 2 diabetes or serious heart conditions.
  • It had the unfortunate effect of giving a false sense of security to people who weren’t in those high-risk categories. Once case rates dropped, vaccines became available and fear of the virus wore off, many people let their guard down, ditching masks, spending time in crowded indoor places.
  • It has become clear that even people with mild cases of Covid-19 can develop long-term serious and debilitating diseases. Long Covid, whose symptoms include months of persistent fatigue, shortness of breath, muscle aches and brain fog, hasn’t been the virus’s only nasty surprise.
  • In February 2022, a study found that, for at least a year, people who had Covid-19 had a substantially increased risk of heart disease—even people who were younger and had not been hospitalized.
  • respiratory conditions.
  • Some scientists now suspect that Covid-19 might be capable of affecting nearly every organ system in the body. It may play a role in the activation of dormant viruses and latent autoimmune conditions people didn’t know they had
  •  A blood test, he says, would tell people if they are at higher risk of long Covid and whether they should have antivirals on hand to take right away should they contract Covid-19.
  • If the risks of long Covid had been known, would people have reacted differently, especially given the confusion over masks and lockdowns and variants? Perhaps. At the least, many people might not have assumed they were out of the woods just because they didn’t have any of the risk factors.
Javier E

Our politics isn't designed to protect the public from Covid-19 | George Monbiot | Opin... - 0 views

  • The worst possible people are in charge at the worst possible time. In the UK, the US and Australia, the politics of the governing parties have been built on the dismissal and denial of risk.
  • Just as these politics have delayed the necessary responses to climate breakdown, ecological collapse, air and water pollution, obesity and consumer debt, so they appear to have delayed the effective containment of Covid-19.
  • I believe it is no coincidence that these three governments have responded later than comparable nations have, and with measures that seemed woefully unmatched to the scale of the crisis
  • ...14 more annotations...
  • to have responded promptly and sufficiently would have meant jettisoning an entire structure of political thought developed in these countries over the past half century.
  • Politics is best understood as public relations for particular interests. The interests come first; politics is the means by which they are justified and promoted
  • On the left, the dominant interest groups can be very large – everyone who uses public services, for instance
  • On the right they tend to be much smaller. In the US, the UK and Australia, they are very small indeed: mostly multimillionaires and a very particular group of companies: those whose profits depend on the cavalier treatment of people and planet
  • I’ve seen how the tobacco companies covertly funded an infrastructure of persuasion to deny the impacts of smoking. This infrastructure was then used, often by the same professional lobbyists, to pour doubt on climate science and attack researchers and environmental campaigners.
  • these companies funded rightwing thinktanks and university professors to launch attacks on public health policy in general and create a new narrative of risk, tested on focus groups and honed in the media
  • They reframed responsible government as the “nanny state”, the “health police” and “elf ’n’ safety zealots”. They dismissed scientific findings and predictions as “unfounded fears”, “risk aversion” and “scaremongering”.
  • Public protections were recast as “red tape”, “interference” and “state control”. Government itself was presented as a mortal threat to our freedom.
  • The groups these corporations helped to fund – thinktanks and policy units, lobbyists and political action committees – were then used by other interests: private health companies hoping to break up the NHS, pesticide manufacturers seeking to strike down regulatory controls, junk food manufacturers resisting advertising restrictions, billionaires seeking to avoid tax
  • Between them, these groups refined the justifying ideology for fragmenting and privatising public services, shrinking the state and crippling its ability to govern.
  • Now, in these three nations, this infrastructure is the government. No 10 Downing Street has been filled with people from groups strongly associated with attacks on regulation and state intervention.
  • Modern politics is impossible to understand without grasping the pollution paradox. The greater the risk to public health and wellbeing a company presents, the more money it must spend on politics – to ensure it isn’t regulated out of existence. Political spending comes to be dominated by the dirtiest companies
  • The theory on which this form of government is founded can seem plausible and logically consistent. Then reality hits, and we find ourselves in the worst place from which to respond to crisis, with governments that have an ingrained disregard for public safety and a reflexive resort to denial
  • It is what we see today, as the Trump, Johnson and Morrison governments flounder in the face of this pandemic. They are called upon to govern, but they know only that government is the enemy.
Javier E

How Coronavirus Overpowered the World Health Organization - WSJ - 1 views

  • The WHO spent years and hundreds of millions of dollars honing a globe-spanning system of defenses against a pandemic it knew would come. But the virus moved faster than the United Nations agency, exposing flaws in its design and operation that bogged down its response when the world needed to take action.
  • The WHO relied on an honor system to stop a viral cataclysm. Its member states had agreed to improve their ability to contain infectious disease epidemics and to report any outbreaks that might spread beyond their borders. International law requires them to do both.
  • Time and again, countries big and small have failed to do so. The WHO, which isn’t a regulatory agency, lacks the authority to force information from the very governments that finance its programs and elect its leaders
  • ...49 more annotations...
  • years of painstakingly worded treaties, high-level visits and cutting-edge disease surveillance—all meant to encourage good-faith cooperation—have only bitten around the edges of the problem.
  • “It can’t demand entry into a country because they think something bad is happening.”
  • Nearly 200 countries were counting on an agency whose budget—roughly $2.4 billion in 2020—is less than a sixth of the Maryland Department of Health’s. Its donors, largely Western governments, earmark most of that money for causes other than pandemic preparedness.
  • In 2018 and 2019, about 8% of the WHO’s budget went to activities related to pandemic preparedness
  • It took those experts more than four months to agree that widespread mask-wearing helps, and that people who are talking, shouting or singing can expel the virus through tiny particles that linger in the air. In that time, about half a million people died.
  • To write its recommendations, the WHO solicits outside experts, which can be a slow process.
  • the agency’s bureaucratic structure, diplomatic protocol and funding were no match for a pandemic as widespread and fast-moving as Covid-19.
  • As months rolled on, it became clear that governments were reluctant to allow the U.N. to scold, shame or investigate them.
  • In particular, The Wall Street Journal found:
  • * China appears to have violated international law requiring governments to swiftly inform the WHO and keep it in the loop about an alarming infectious-disease cluster.
  • There are no clear consequences for violations.
  • * The WHO lost a critical week waiting for an advisory panel to recommend a global public-health emergency, because some of its members were overly hopeful that the new disease wasn’t easily transmissible from one person to another.
  • * The institution overestimated how prepared some wealthy countries were, while focusing on developing countries, where much of its ordinary assistance is directed.
  • Public-health leaders say the WHO plays a critical role in global health, leading responses to epidemics and setting health policies and standards for the world. It coordinates a multinational effort every year to pick the exact strains that go into the seasonal flu vaccine, and has provided public guidance and advice on Covid-19 when many governments were silent.
  • The world’s public-health agency was born weak, created in 1948 over U.S. and U.K. reluctance. For decades, it was legally barred from responding to diseases that it learned about from the news. Countries were required to report outbreaks of only four diseases to the WHO: yellow fever, plague, cholera and smallpox, which was eradicated in 1980.
  • SARS convinced governments to retool the WHO. The next year, delegates arrived in the Geneva palace where the League of Nations once met to resolve a centuries-old paradox: Countries don’t report outbreaks, because they fear—correctly—their neighbors will respond by blocking travel and trade.
  • Nearly three times that amount was budgeted for eradicating polio, a top priority for the WHO’s two largest contributors: the U.S. and the Bill & Melinda Gates Foundation.
  • “Everybody pushed back. No sovereign country wants to have this.”
  • China wanted an exemption from immediately reporting SARS outbreaks. The U.S. argued it couldn’t compel its 50 states to cooperate with the treaty. Iran blocked American proposals to make the WHO focus on bioterrorism. Cuba had an hourslong list of objections.
  • Around 3:15 a.m. on the last day, exhausted delegates ran out of time. The treaty they approved, called the International Health Regulations, imagined that each country would quickly and honestly report, then contain, any alarming outbreaks
  • In return, the treaty discouraged restrictions on travel and trade. There would be no consequences for reporting an outbreak—yet no way to punish a country for hiding one.
  • The treaty’s key chokepoint: Before declaring a “public health emergency of international concern,” or PHEIC, the WHO’s director-general would consult a multinational emergency committee and give the country in question a chance to argue against such a declaration.
  • Delegates agreed this could give some future virus a head start but decided it was more important to discourage the WHO from making any unilateral announcements that could hurt their economies.
  • Over the next few years, emergency committees struggled over how to determine whether an outbreak was a PHEIC. It took months to declare emergencies for two deadly Ebola epidemics.
  • On Jan. 3, representatives of China’s National Health Commission arrived at the WHO office in Beijing. The NHC acknowledged a cluster of pneumonia cases, but didn’t confirm that the new pathogen was a coronavirus, a fact Chinese officials already knew.
  • That same day, the NHC issued an internal notice ordering laboratories to hand over or destroy testing samples and forbade anyone from publishing unauthorized research on the virus.
  • China’s failure to notify the WHO of the cluster of illnesses is a violation of the International Health Regulations.
  • China also flouted the IHR by not disclosing all key information it had to the WHO.
  • The WHO said it’s up to member states to decide whether a country has complied with international health law, and that the coming review will address those issues.
  • While Chinese scientists had sequenced the genome and posted it publicly, the government was less forthcoming about how patients might be catching the virus.
  • WHO scientists pored over data they did get, and consulted with experts from national health agencies, including the CDC, which has 33 staff detailed to the WHO.
  • Then a 61-year-old woman was hospitalized in Thailand on Jan. 13.
  • The next day, Dr. van Kerkhove told reporters: “It’s certainly possible that there is limited human-to-human transmission.” MERS and SARS, both coronaviruses, were transmissible among people in close quarters. Epidemiological investigations were under way, she said.
  • On Jan. 22, a committee of 15 scientists haggled for hours over Chinese data and a handful of cases in other countries. Clearly, the virus was spreading between people in China, though there was no evidence of that in other countries. The question now: Was it mainly spreading from very sick people in hospitals and homes—or more widely?
  • The committee met over two days, but was split. They mostly agreed on one point: The information from China “was a little too imprecise to very clearly state that it was time” to recommend an emergency declaration,
  • On Jan. 28, Dr. Tedros and the WHO team arrived for their meeting with Mr. Xi.
  • Leaning across three wooden coffee tables, Dr. Tedros pressed for cooperation. In the absence of information, countries might react out of fear and restrict travel to China, he repeated several times throughout the trip. Mr. Xi agreed to allow a WHO-led international team of experts to visit. It took until mid-February to make arrangements and get the team there.
  • China also agreed to provide more data, and Dr. Tedros departed, leaving Dr. Briand behind with a list of mysteries to solve. How contagious was the virus? How much were children or pregnant women at risk? How were cases linked? This was vital information needed to assess the global risk, Dr. Briand said.
  • Back in Geneva, Dr. Tedros reconvened the emergency committee. By now it was clear there was human-to-human transmission in other countries. When it met on Jan. 30, the committee got the information the WHO had been seeking. This time the committee recommended and Dr. Tedros declared a global public-health emergency.
  • President Trump and New York Gov. Andrew Cuomo both assured constituents their health systems would perform well. The U.K.’s chief medical officer described the WHO’s advice as largely directed at poor and middle-income countries. As for keeping borders open, by then many governments had already closed them to visitors from China.
  • The WHO shifted focus to the developing world, where it believed Covid-19 would exact the heaviest toll. To its surprise, cases shot up just across the border, in northern Italy.
  • the WHO’s health emergencies unit should report to the director-general and not member states, and its budget should be protected so it doesn’t have to compete with other programs for money.
  • If there were one thing the WHO might have done differently, it would be to offer wealthier countries the type of assistance with public-health interventions that the WHO provides the developing world.
  • The WHO’s warning system for declaring a global public-health emergency needs to change. Some want to see a warning system more like a traffic light—with color-coded alarms for outbreaks, based on how worried the public should be.
  • Emergency committees need clearer criteria for declaring a global public-health emergency and should publicly explain their thinking.
  • The WHO should have more powers to intervene in countries to head off a health crisis.
  • Implementing many of those ideas would require herding diplomats back for another monthslong slog of treaty revisions. If and when such talks begin, new governments will likely be in place, and political priorities will float elsewhere.
  • “Unfortunately, I’m very cynical about this,” he said. “We are living through cycles of panic and neglect. We’ve been through all of this before.”
Javier E

U.S. officials misled the public about the war in Afghanistan, confidential documents r... - 0 views

  • In the interviews, more than 400 insiders offered unrestrained criticism of what went wrong in Afghanistan and how the United States became mired in nearly two decades of warfare. With a bluntness rarely expressed in public, the interviews lay bare pent-up complaints, frustrations and confessions, along with second-guessing and backbiting.
  • Since 2001, more than 775,000 U.S. troops have deployed to Afghanistan, many repeatedly. Of those, 2,300 died there and 20,589 were wounded in action, according to Defense Department figures.
  • They underscore how three presidents — George W. Bush, Barack Obama and Donald Trump — and their military commanders have been unable to deliver on their promises to prevail in Afghanistan.
  • ...39 more annotations...
  • With most speaking on the assumption that their remarks would not become public, U.S. officials acknowledged that their warfighting strategies were fatally flawed and that Washington wasted enormous sums of money trying to remake Afghanistan into a modern nation.
  • The interviews also highlight the U.S. government’s botched attempts to curtail runaway corruption, build a competent Afghan army and police force, and put a dent in Afghanistan’s thriving opium trade.
  • Since 2001, the Defense Department, State Department and U.S. Agency for International Development have spent or appropriated between $934 billion and $978 billion.
  • Those figures do not include money spent by other agencies such as the CIA and the Department of Veterans Affairs, which is responsible for medical care for wounded veterans.
  • Several of those interviewed described explicit and sustained efforts by the U.S. government to deliberately mislead the public. They said it was common at military headquarters in Kabul — and at the White House — to distort statistics to make it appear the United States was winning the war when that was not the case.
  • SIGAR departed from its usual mission of performing audits and launched a side venture. Titled “Lessons Learned,” the $11 million project was meant to diagnose policy failures in Afghanistan so the United States would not repeat the mistakes the next time it invaded a country or tried to rebuild a shattered one.
  • the reports, written in dense bureaucratic prose and focused on an alphabet soup of government initiatives, left out the harshest and most frank criticisms from the interviews.
  • “We found the stabilization strategy and the programs used to achieve it were not properly tailored to the Afghan context, and successes in stabilizing Afghan districts rarely lasted longer than the physical presence of coalition troops and civilians,” read the introduction to one report released in May 2018.
  • To augment the Lessons Learned interviews, The Post obtained hundreds of pages of previously classified memos about the Afghan war that were dictated by Defense Secretary Donald H. Rumsfeld between 2001 and 2006.
  • Together, the SIGAR interviews and the Rumsfeld memos pertaining to Afghanistan constitute a secret history of the war and an unsparing appraisal of 18 years of conflict.
  • With their forthright descriptions of how the United States became stuck in a faraway war, as well as the government's determination to conceal them from the public, the Lessons Learned interviews broadly resemble the Pentagon Papers, the Defense Department's top-secret history of the Vietnam War.
  • running throughout are torrents of criticism that refute the official narrative of the war, from its earliest days through the start of the Trump administration.
  • At the outset, for instance, the U.S. invasion of Afghanistan had a clear, stated objective — to retaliate against al-Qaeda and prevent a repeat of the Sept. 11, 2001, attacks.
  • Yet the interviews show that as the war dragged on, the goals and mission kept changing and a lack of faith in the U.S. strategy took root inside the Pentagon, the White House and the State Department.
  • Fundamental disagreements went unresolved. Some U.S. officials wanted to use the war to turn Afghanistan into a democracy. Others wanted to transform Afghan culture and elevate women’s rights. Still others wanted to reshape the regional balance of power among Pakistan, India, Iran and Russia.
  • The Lessons Learned interviews also reveal how U.S. military commanders struggled to articulate who they were fighting, let alone why.
  • Was al-Qaeda the enemy, or the Taliban? Was Pakistan a friend or an adversary? What about the Islamic State and the bewildering array of foreign jihadists, let alone the warlords on the CIA’s payroll? According to the documents, the U.S. government never settled on an answer.
  • As a result, in the field, U.S. troops often couldn’t tell friend from foe.
  • The United States has allocated more than $133 billion to build up Afghanistan — more than it spent, adjusted for inflation, to revive the whole of Western Europe with the Marshall Plan after World War II.
  • As commanders in chief, Bush, Obama and Trump all promised the public the same thing. They would avoid falling into the trap of "nation-building" in Afghanistan.
  • U.S. officials tried to create — from scratch — a democratic government in Kabul modeled after their own in Washington. It was a foreign concept to the Afghans, who were accustomed to tribalism, monarchism, communism and Islamic law.
  • During the peak of the fighting, from 2009 to 2012, U.S. lawmakers and military commanders believed the more they spent on schools, bridges, canals and other civil-works projects, the faster security would improve. Aid workers told government interviewers it was a colossal misjudgment, akin to pumping kerosene on a dying campfire just to keep the flame alive.
  • One unnamed executive with the U.S. Agency for International Development (USAID) guessed that 90 percent of what they spent was overkill: “We lost objectivity. We were given money, told to spend it and we did, without reason.” (Lessons Learned interview, 10/7/2016)
  • The gusher of aid that Washington spent on Afghanistan also gave rise to historic levels of corruption.
  • In public, U.S. officials insisted they had no tolerance for graft. But in the Lessons Learned interviews, they admitted the U.S. government looked the other way while Afghan power brokers — allies of Washington — plundered with impunity.
  • Christopher Kolenda, an Army colonel who deployed to Afghanistan several times and advised three U.S. generals in charge of the war, said that the Afghan government led by President Hamid Karzai had “self-organized into a kleptocracy” by 2006 — and that U.S. officials failed to recognize the lethal threat it posed to their strategy.
  • By allowing corruption to fester, U.S. officials told interviewers, they helped destroy the popular legitimacy of the wobbly Afghan government they were fighting to prop up. With judges and police chiefs and bureaucrats extorting bribes, many Afghans soured on democracy and turned to the Taliban to enforce order.
  • None expressed confidence that the Afghan army and police could ever fend off, much less defeat, the Taliban on their own. More than 60,000 members of Afghan security forces have been killed, a casualty rate that U.S. commanders have called unsustainable.
  • In the Lessons Learned interviews, however, U.S. military trainers described the Afghan security forces as incompetent, unmotivated and rife with deserters. They also accused Afghan commanders of pocketing salaries — paid by U.S. taxpayers — for tens of thousands of “ghost soldiers.”
  • Year after year, U.S. generals have said in public they are making steady progress on the central plank of their strategy: to train a robust Afghan army and national police force that can defend the country without foreign help.
  • From the beginning, Washington never really figured out how to incorporate a war on drugs into its war against al-Qaeda. By 2006, U.S. officials feared that narco-traffickers had become stronger than the Afghan government and that money from the drug trade was powering the insurgency.
  • throughout the Afghan war, documents show that U.S. military officials have resorted to an old tactic from Vietnam — manipulating public opinion. In news conferences and other public appearances, those in charge of the war have followed the same talking points for 18 years. No matter how the war is going — and especially when it is going badly — they emphasize how they are making progress.
  • Two months later, Marin Strmecki, a civilian adviser to Rumsfeld, gave the Pentagon chief a classified, 40-page report loaded with more bad news. It said “enormous popular discontent is building” against the Afghan government because of its corruption and incompetence. It also said that the Taliban was growing stronger, thanks to support from Pakistan, a U.S. ally.
  • Since then, U.S. generals have almost always preached that the war is progressing well, no matter the reality on the battlefield.
  • The Lessons Learned interviews contain numerous admissions that the government routinely touted statistics that officials knew were distorted, spurious or downright false.
  • A person identified only as a senior National Security Council official said there was constant pressure from the Obama White House and Pentagon to produce figures to show the troop surge of 2009 to 2011 was working, despite hard evidence to the contrary.
  • Even when casualty counts and other figures looked bad, the senior NSC official said, the White House and Pentagon would spin them to the point of absurdity. Suicide bombings in Kabul were portrayed as a sign of the Taliban’s desperation, that the insurgents were too weak to engage in direct combat. Meanwhile, a rise in U.S. troop deaths was cited as proof that American forces were taking the fight to the enemy.
  • “And this went on and on for two reasons,” the senior NSC official said, “to make everyone involved look good, and to make it look like the troops and resources were having the kind of effect where removing them would cause the country to deteriorate.”
Javier E

Opinion | Vaccine Hesitancy Is About Trust and Class - The New York Times - 0 views

  • The world needs to address the root causes of vaccine hesitancy. We can’t go on believing that the issue can be solved simply by flooding skeptical communities with public service announcements or hectoring people to “believe in science.”
  • For the past five years, we’ve conducted surveys and focus groups abroad and interviewed residents of the Bronx to better understand vaccine avoidance.
  • We’ve found that people who reject vaccines are not necessarily less scientifically literate or less well-informed than those who don’t. Instead, hesitancy reflects a transformation of our core beliefs about what we owe one another.
  • ...43 more annotations...
  • Over the past four decades, governments have slashed budgets and privatized basic services. This has two important consequences for public health
  • First, people are unlikely to trust institutions that do little for them.
  • second, public health is no longer viewed as a collective endeavor, based on the principle of social solidarity and mutual obligation. People are conditioned to believe they’re on their own and responsible only for themselves.
  • an important source of vaccine hesitancy is the erosion of the idea of a common good.
  • “People are thinking, ‘If the government isn’t going to do anything for us,’” said Elden, “‘then why should we participate in vaccines?’”
  • Since the spring, when most American adults became eligible for Covid vaccines, the racial gap in vaccination rates between Black and white people has been halved. In September, a national survey found that vaccination rates among Black and white Americans were almost identical.
  • Other surveys have determined that a much more significant factor was college attendance: Those without a college degree were the most likely to go unvaccinated.
  • Education is a reliable predictor of socioeconomic status, and other studies have similarly found a link between income and vaccination.
  • It turns out that the real vaccination divide is class.
  • compared with white Americans, communities of color do experience the American health care system differently. But a closer look at the data reveals a more complicated picture.
  • during the 1950s polio campaigns, for example, most people saw vaccination as a civic duty.
  • But as the public purse shrunk in the 1980s, politicians insisted that it’s no longer the government’s job to ensure people’s well-being; instead, Americans were to be responsible only for themselves and their own bodies
  • Entire industries, such as self-help and health foods, have sprung up on the principle that the key to good health lies in individuals making the right choices.
  • Without an idea of the common good, health is often discussed using the language of “choice.”
  • there are problems with reducing public health to a matter of choice. It gives the impression that individuals are wholly responsible for their own health.
  • This is despite growing evidence that health is deeply influenced by factors outside our control; public health experts now talk about the “social determinants of health,” the idea that personal health is never simply a reflection of individual lifestyle choices but also of the class people are born into, the neighborhood they grew up in and the race they belong to.
  • food deserts and squalor are not easy problems to solve — certainly not by individuals or charities — and they require substantial government action.
  • Many medical schools teach “motivational interviewing,”
  • The deeper problem: Being healthy is not cheap. Studies indicate that energy-dense foods with less nutritious value are more affordable, and low-cost diets are linked to obesity and insulin resistance.
  • This isn’t surprising, since we shop for doctors and insurance plans the way we do all other goods and services.
  • Another problem with reducing well-being to personal choice is that this treats health as a commodity.
  • Mothers devoted many hours to “researching” vaccines, soaking up parental advice books and quizzing doctors. In other words, they acted like savvy consumers.
  • When thinking as a consumer, people tend to downplay social obligations in favor of a narrow pursuit of self-interest. As one parent told Reich, “I’m not going to put my child at risk to save another child.”
  • Such risk-benefit assessments for vaccines are an essential part of parents’ consumer research.
  • Vaccine uptake is so high among wealthy people because Covid is one of the gravest threats they face. In some wealthy Manhattan neighborhoods, for example, vaccination rates run north of 90 percent.
  • For poorer and working-class people, though, the calculus is different: Covid-19 is only one of multiple grave threats.
  • When viewed in the context of the other threats they face, Covid no longer seems uniquely scary.
  • Most of the people we interviewed in the Bronx say they are skeptical of the institutions that claim to serve the poor but in fact have abandoned them.
  • he and his friends find reason to view the government’s sudden interest in their well-being with suspicion. “They are over here shoving money at us,” a woman told us, referring to a New York City offer to pay a $500 bonus to municipal workers to get vaccinated. “And I’m asking, why are you so eager, when you don’t give us money for anything else?”
  • These views reinforce the work of social scientists who find a link between a lack of trust and inequality. And without trust, there is no mutual obligation, no sense of a common good.
  • The experience of the 1960s suggests that when people feel supported through social programs, they’re more likely to trust institutions and believe they have a stake in society’s health.
  • Research shows that private systems not only tend to produce worse health outcomes than public ones, but privatization creates what public health experts call “segregated care,” which can undermine the feelings of social solidarity that are critical for successful vaccination drives
  • In one Syrian city, for example, the health care system now consists of one public hospital so underfunded that it is notorious for poor care, a few private hospitals offering high-quality care that are unaffordable to most of the population, and many unlicensed and unregulated private clinics — some even without medical doctors — known to offer misguided health advice. Under such conditions, conspiracy theories can flourish; many of the city’s residents believe Covid vaccines are a foreign plot.
  • In many developing nations, international aid organizations are stepping in to offer vaccines. These institutions are sometimes more equitable than governments, but they are often oriented to donor priorities, not community needs.
  • “We have starvation and women die in childbirth,” one tribal elder told us. “Why do they care so much about polio? What do they really want?”
  • In America, anti-vaccine movements are as old as vaccines themselves; efforts to immunize people against smallpox prompted bitter opposition at the turn of the last century. But after World War II, these attitudes disappeared. In the 1950s, demand for the polio vaccine often outstripped supply, and by the late 1970s, nearly every state had laws mandating vaccinations for school with hardly any public opposition.
  • What changed? This was the era of large, ambitious government programs like Medicare and Medicaid.
  • The anti-measles policy, for example, was an outgrowth of President Lyndon Johnson’s Great Society and War on Poverty initiatives.
  • While the reasons vary by country, the underlying causes are the same: a deep mistrust in local and international institutions, in a context in which governments worldwide have cut social services.
  • Only then do the ideas of social solidarity and mutual obligation begin to make sense.
  • The types of social programs that best promote this way of thinking are universal ones, like Social Security and universal health care.
  • If the world is going to beat the pandemic, countries need policies that promote a basic, but increasingly forgotten, idea: that our individual flourishing is bound up in collective well-being.
Javier E

Opinion | There's a Reason There Aren't Enough Teachers in America. Many Reasons, Actua... - 0 views

  • Here are just a few of the longstanding problems plaguing American education: a generalized decline in literacy; the faltering international performance of American students; an inability to recruit enough qualified college graduates into the teaching profession; a lack of trained and able substitutes to fill teacher shortages; unequal access to educational resources; inadequate funding for schools; stagnant compensation for teachers; heavier workloads; declining prestige; and deteriorating faculty morale.
  • Nine-year-old students earlier this year revealed “the largest average score decline in reading since 1990, and the first ever score decline in mathematics,”
  • In the latest comparison of fourth grade reading ability, the United States ranked below 15 countries, including Russia, Ireland, Poland and Bulgaria.
  • ...49 more annotations...
  • Teachers are not only burnt out and undercompensated, they are also demoralized. They are being asked to do things in the name of teaching that they believe are mis-educational and harmful to students and the profession. What made this work good for them is no longer accessible. That is why we are hearing so many refrains of “I’m not leaving the profession, my profession left me.”
  • We find there are at least 36,000 vacant positions along with at least 163,000 positions being held by underqualified teachers, both of which are conservative estimates of the extent of teacher shortages nationally.
  • “The current problem of teacher shortages (I would further break this down into vacancy and under-qualification) is higher than normal.” The data, Nguyen continued, “indicate that shortages are worsening over time, particularly over the last few years.”
  • a growing gap between the pay of all college graduates and teacher salaries from 1979 to 2021, with a sharp increase in the differential since 2010
  • The number of qualified teachers is declining for the whole country and the vast majority of states.
  • Wages are essentially unchanged from 2000 to 2020 after adjusting for inflation. Teachers have about the same number of students. But teacher accountability reforms have increased the demands on their positions.
  • The pandemic was very difficult for teachers. Their self-reported level of stress was about twice as high during the pandemic compared to other working adults. Teachers had to both worry about their personal safety and deal with teaching and caring for students who are grieving lost family members.
  • the number of students graduating from college with bachelor’s degrees in education fell from 176,307 in 1970-71 to 104,008 in 2010-11 to 85,058 in 2019-20.
  • We do see that southern states (e.g., Mississippi, Alabama, Georgia and Florida) have very high vacancies and high vacancy rates.
  • By 2021, teachers made $1,348, 32.9 percent less than what other graduates made, at $2,009.
  • These gaps play a significant role in determining the quality of teachers,
  • Sixty percent of teachers and 65 percent of principals reported believing that systemic racism exists. Only about 20 percent of teachers and principals reported that they believe systemic racism does not exist, and the remainder were not sure
  • “We find,” they write, “that teachers’ cognitive skills differ widely among nations — and that these differences matter greatly for students’ success in school. An increase of one standard deviation in teacher cognitive skills is associated with an increase of 10 to 15 percent of a standard deviation in student performance.”
  • teachers have lower cognitive skills, on average, in countries with greater nonteaching job opportunities for women in high-skill occupations and where teaching pays relatively less than other professions.
  • the scholars found that the cognitive skills of teachers in the United States fell in the middle ranks: Teachers in the United States perform worse than the average teacher sample-wide in numeracy, with a median score of 284 points out of a possible 500, compared to the sample-wide average of 292 points. In literacy, they perform slightly better than average, with a median score of 301 points compared to the sample-wide average of 295 points.
  • Increasing teacher numeracy skills by one standard deviation increases student performance by nearly 15 percent of a standard deviation on the PISA math test. Our estimate of the effect of increasing teacher literacy skills on students’ reading performance is slightly smaller, at 10 percent of a standard deviation.
  • How, then, to raise teacher skill level in the United States? Hanushek and his two colleagues have a simple answer: raise teacher pay to make it as attractive to college graduates as high-skill jobs in other fields.
  • policymakers will need to do more than raise teacher pay across the board to ensure positive results. They must ensure that higher salaries go to more effective teachers.
  • The teaching of disputed subjects in schools has compounded many of the difficulties in American education.
  • The researchers found that controversies over critical race theory, sex education and transgender issues — aggravated by divisive debates over responses to Covid and its aftermath — are inflicting a heavy toll on teachers and principals.
  • “On top of the herculean task of carrying out the essential functions of their jobs,” they write, “educators increasingly find themselves in the position of addressing contentious, politicized issues in their schools as the United States has experienced increasing political polarization.”
  • Teachers and principals, they add, “have been pulled in multiple directions as they try to balance and reconcile not only their own beliefs on such matters but also the beliefs of others around them, including their leaders, fellow staff, students, and students’ family members.”
  • These conflicting pressures take place in a climate where “emotions in response to these issues have run high within communities, resulting in the harassment of educators, bans against literature depicting diverse characters, and calls for increased parental involvement in deciding academic content.”
  • Forty-eight percent of principals and 40 percent of teachers reported that the intrusion of political issues and opinions in school leadership or teaching, respectively, was a job-related stressor. By comparison, only 16 percent of working adults indicated that the intrusion of political issues and opinions in their jobs was a source of job-related stress
  • In 1979, the average teacher weekly salary (in 2021 dollars) was $1,052, 22.9 percent less than other college graduates’, at $1,364. (A quick arithmetic check of these pay-gap figures appears after this list.)
  • Nearly all Black or African American principals (92 percent) and teachers (87 percent) reported believing that systemic racism exists.
  • White educators working in predominantly white school systems reported substantially more pressure to deal with politically divisive issues than educators of color and those working in mostly minority schools: “Forty-one percent of white teachers and 52 percent of white principals selected the intrusion of political issues and opinions into their professions as a job-related stressor, compared with 36 percent of teachers of color and principals of color.”
  • A 54 percent majority of teachers and principals said there “should not be legal limits on classroom conversations about racism, sexism, and other topics,” while 20 percent said there should be legislated constraint
  • Voters, in turn, are highly polarized on the teaching of issues impinging on race or ethnicity in public schools. The Education Next 2022 Survey asked, for example: Some people think their local public schools place too little emphasis on slavery, racism and other challenges faced by Black people in the United States. Other people think their local public schools place too much emphasis on these topics. What is your view about your local public schools?
  • Among Democrats, 55 percent said too little emphasis was placed on slavery, racism and other challenges faced by Black people, and 8 percent said too much.
  • Among Republicans, 51 percent said too much and 10 percent said too little.
  • Because of the lack of reliable national data, there is widespread disagreement among scholars of education over the scope and severity of the shortage of credentialed teachers, although there is more agreement that these problems are worse in low-income, high majority-minority school systems and in STEM and special education faculties.
  • Public schools increasingly are targets of conservative political groups focusing on what they term “Critical Race Theory,” as well as issues of sexuality and gender identity. These political conflicts have created a broad chilling effect that has limited opportunities for students to practice respectful dialogue on controversial topics and made it harder to address rampant misinformation.
  • The chilling effect also has led to marked declines in general support for teaching about race, racism, and racial and ethnic diversity.
  • These political conflicts, the authors wrote, have made the already hard work of public education more difficult, undermining school management, negatively impacting staff, and heightening student stress and anxiety. Several principals shared that they were reconsidering their own roles in public education in light of the rage at teachers and rage at administrators playing out in their communities.
  • State University of New York tracked trends on “four interrelated constructs: professional prestige, interest among students, preparation for entry, and job satisfaction” for 50 years, from the 1970s to the present, and found a consistent and dynamic pattern across every measure: a rapid decline in the 1970s, a swift rise in the 1980s, relative stability for two decades, and a sustained drop beginning around 2010. The current state of the teaching profession is at or near its lowest levels in 50 years.
  • Who among the next generation of college graduates will choose to teach?
  • Perceptions of teacher prestige have fallen between 20 percent and 47 percent in the last decade to be at or near the lowest levels recorded over the last half century
  • Interest in the teaching profession among high school seniors and college freshmen has fallen 50 percent since the 1990s, and 38 percent since 2010, reaching the lowest level in the last 50 years
  • the proportion of college graduates that go into teaching is at a 50-year low
  • Teachers’ job satisfaction is also at the lowest level in five decades, with the percent of teachers who feel the stress of their job is worth it dropping from 81 percent to 42 percent in the last 15 years
  • The combination of these factors — declining prestige, lower pay than other professions that require a college education, increased workloads, and political and ideological pressures — is creating both intended and unintended consequences for teacher accountability reforms mandating tougher licensing rules, evaluations and skill testing.
  • Education policy over the past decade has focused considerable effort on improving human capital in schools through teacher accountability. These reforms, and the research upon which they drew, were based on strong assumptions about how accountability would affect who decided to become a teacher. Counter to most assumptions, our findings document how teacher accountability reduced the supply of new teacher candidates by, in part, decreasing perceived job security, satisfaction and autonomy.
  • The reforms, Kraft and colleagues continued, increased the likelihood that schools could not fill vacant teaching positions. Even more concerning, effects on unfilled vacancies were concentrated in hard-to-staff schools that often serve larger populations of low-income students and students of color.
  • We find that evaluation reforms increased the quality of newly hired novice teachers by reducing the number of teachers that graduated from the least selective institutions
  • We find no evidence that evaluation reforms served to attract teachers who attended the most selective undergraduate institutions.
  • In other words, the economic incentives, salary structure and work-life pressures characteristic of public education employment have created a climate in which contemporary education reforms have perverse and unintended consequences that can worsen rather than alleviate the problems facing school systems.
  • If so, to improve the overall quality of the nation’s more than three million public schoolteachers, reformers may want to give priority to paychecks, working conditions, teacher autonomy and punishing workloads before attempting to impose higher standards, tougher evaluations and less job security.
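As a quick illustration (my own, not part of the excerpted column), the pay-gap percentages quoted above can be re-derived from the weekly salary figures given in the annotations. A minimal Python sketch, assuming only the 1979 and 2021 figures as quoted (in 2021 dollars):

```python
# Illustrative check, not from the article: re-derive the teacher pay penalty
# from the average weekly salary figures quoted in the excerpts (2021 dollars).

def pay_penalty_pct(teacher_weekly: float, graduate_weekly: float) -> float:
    """Teacher pay gap as a percentage of other college graduates' weekly pay."""
    return (graduate_weekly - teacher_weekly) / graduate_weekly * 100

for year, teacher, graduate in [(1979, 1052, 1364), (2021, 1348, 2009)]:
    print(f"{year}: teachers earned {pay_penalty_pct(teacher, graduate):.1f}% less")

# Prints 22.9% for 1979 and 32.9% for 2021, matching the gaps cited above.
```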