History Readings: Group items tagged "grants"

The One Word That Bars Trump From Pardoning Himself - The Atlantic

  • Based solely on other uses of "grant" in the Constitution, a person could reasonably determine that a president cannot grant himself a pardon. But in evaluating the meaning of the Constitution’s words, the text of the Constitution isn’t all that counts.
  • The most common interpretive method these days—championed by Justice Antonin Scalia and now broadly popular among conservatives—is to look for evidence of a term’s “original public meaning.” That, theoretically, is the meaning that ordinary English speakers of the late 18th century would have attached to a given term when coming upon it in a legal document like the Constitution.
  • to the extent that the most popular contemporaneous law dictionary is valuable in understanding what ordinary speakers of the founding era meant by “granting,” it seems clear that they probably had in mind an interpersonal transfer.
  • in the time period from 1750 to 1800, essentially none of these appears. Transitive uses of the verb—“grant me,” “grant him,” “grant her,” “grant us,” “grant you,” and the like, where the person receiving the grant is different from the person doing the granting—are all common. But reflexive uses, where the person doing the granting is also the person on the receiving end? All but nonexistent
  • Can Donald Trump pardon himself? Perhaps, but that’s not the question the Constitution requires us to ask. Can Donald Trump grant himself a pardon? The evidence, at least according to the text of the Constitution and its original meaning, says no.

GOPers Crack Down On The Private Election Grants That Helped Avoid A Pandemic Fiasco | ...

  • “I have no doubt that the concern stems from what happened in the 2020 election,” said Rick Hasen, a UC-Irvine law professor who runs the election law blog and has written several books about election administration. “Anything that helped that election run smoothly and effectively and cleanly is now the target for attack.”
  • The grant programs allowed many election administrators to put on what one expert at NYU’s Brennan Center described as their “dream” elections. About one in every five local offices accepted the philanthropic funding.
  • The Republican push to prohibit private election grants comes after the charity grants became a piece of the conspiracy theories floating around the 2020 election — fueled by President Trump and his allies’ ongoing beef with Mark Zuckerberg, the Facebook founder whose charity put up more than $400 million in grants for election administration last year.
  • “One impact of the proposal to ban private grants is to reduce funding for elections,” said Christian Grose, a University of Southern California professor and director of its USC Schwarzenegger Institute for State and Global Policy, which also distributed election grants in 2020. “By reducing funding for elections, it makes it harder and it closes polling places.”
  • Conservative activists rushed to court last year to try to block the grant programs, with lawsuits mainly targeting big, Democratic-leaning cities that had accepted the CTCL funding. The lawsuits — usually spearheaded by a conservative group known as the Amistad Project — claimed that CTCL was purposely funneling its donations towards jurisdictions with “progressive” voting patterns, supposedly with the goal of boosting urban turnout and electing Democrats. 
  • Such claims ignored that the grant was open to any jurisdiction that wanted to apply for it and met its criteria.
  • In Pennsylvania, more than half of the jurisdictions that accepted the CTCL local funding broke for Trump in 2020. Ninety-eight of the 117 Texas recipient jurisdictions voted for Trump in 2016. 

These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprung up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new "Rules Governing the Application of Passports," which required evidence of identity, including a close physical description: Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____; as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 (Caption to an accompanying photograph: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.)
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • This instability contributed to a broader set of political concerns that became Mary Lease's obsession, concerns known as "the money question," and traceable all the way back to Hamilton's economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word "person" in the Fourteenth Amendment, Conkling enjoyed a singular authority: he'd served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word "citizen" in favor of "person" in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago's divinity school defined modernism as "the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons."112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson.
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Woodrow Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • The U.S. Supreme Court struck down much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • Conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.”
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • Fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • Nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department.
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • Wars, as ever, expanded the powers of the state. This war rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • In 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.

About Japan: A Teacher's Resource | Women in Modern Japanese History | Japan Society - 0 views

  • This paper addresses these assumptions about Japanese women as “behind” and suggests that their lives have been far more varied throughout history and in the present than the stereotypes suggest.
  • Rather than assuming that the west is somehow ahead of the rest of the world, I use what historians call the concept of “coevalness” throughout. By “coeval,” I mean that the situation of women around the world unfolded in relatively similar ways at roughly the same time.
  • I submit that it would be a mistake to blame Japanese women’s supposedly low status on “tradition” or “culture.” This assertion prevents us from seeing the diversity in the historical record and the ways patriarchy—that is, male dominance—was remade and even strengthened in the modern period.
  • ...12 more annotations...
  • Western visitors drew on the writings of Charles Fourier (1772-1837) and others and used the “low” status of women among other “barbaric” Japanese practices to justify the previously-mentioned series of unequal treaties.
  • The overturning of these treaties was one of the main goals of the Japanese state after 1868, a goal achieved by the mid-1890s. This focus led to considerable discussion and reform across several decades. Government officials, intellectuals, and reformers in Japan and across East Asia focused on the “woman question” as a critical part of modernization, necessary to build a strong state and attain equal status with the western powers. Strikingly, they tended to accept the idea that the status of women in East Asia was low. In the process, commentators of all stripes painted a picture of women’s status in the premodern East Asian past that was static and uniform, a view not at all in line with the richness and diversity of the past, a past where some women were highly educated and produced masterful works of art and literature and others had political power and influence.[4]
  • Let us turn briefly to the period before Japan’s transition to modernity. Until quite recently, scholars have tended to see the preceding Edo/Tokugawa (hereafter Edo) period (1600-1868) as representing the nadir of women’s status. Scholars assumed that warrior rule and Neo-Confucian discourses led to an unparalleled subordination of women. Recent studies have challenged this view and revealed a more complicated and nuanced picture, one where women’s lives varied widely by status, age, locale, and time period. In short, scholars have demonstrated that gender ideals promoted by male scholars that stress women’s inferiority tell us little about the lives of the vast majority of women. Moreover, research shows that merchant women enjoyed more property rights than women of samurai (warrior) and peasant backgrounds.
  • One example that demonstrates the variety of women’s experiences lies in the area of education. Access to education grew dramatically during the Edo period. Particularly notable are the growth of what are sometimes called temple schools, where girls and boys learned basic reading and arithmetic. As a result of this development, Japan had one of the highest literacy rates in the early modern world. Moreover, some women of means had access to quite elite forms of education equivalent to those available to elite men
  • This situation would change dramatically in the modern period, for the advent of the nation-state after 1868 and the establishment of universal education in 1872 would eliminate the variety of potential experiences women had, and replace them with a uniform education deemed appropriate to women. In short, after 1872, a greater number of women had access to education than ever before, but the content of this education was more circumscribed than it had been in the past
  • Modern times saw concrete changes in gender roles within households especially in urban settings. In the Edo period, households in villages were productive units where husbands and wives shared labor
  • But as some people moved to the cities—a trend that accelerated in the modern period—husbands went out to work leaving middle class wives at home. Urban families increasingly lived in nuclear units, rather than in extended family groups. In the process, middle class women’s lives increasingly became defined in terms of motherhood, something that had not been highly valued in the Edo period. From the turn of the twentieth century on, middle class women in particular were called upon to be “good wives and wise mothers” (ryōsai kenbo)
  • For poor women, work in the textile mills and sex work continued to be the main occupations as they had in the preceding period. Some scholars have pointed out that Japan’s successful industrial transformation in the nineteenth century was accomplished on the backs of poor women, especially those who toiled in the textile mills. Meanwhile, some women from the middle class were able to pursue a limited number of professions including work as physicians, nurses, and teachers. As Sally Hastings has demonstrated, state policy actually supported these limited opportunities for women because the work was deemed appropriate to their gender. We should not imagine that all Japanese women before 1945 were wives and mothers; professional women existed in the prewar era. In fact, this group of professional women in the 1920s and 1930s played a role in the prewar suffrage movement. They also helped authorize a public role for women and laid the groundwork for women’s enthusiastic participation in political life in the immediate post World War II years.
  • The 1920s saw the rise of a vibrant women’s rights movement in Japan, one related to the movement for women’s suffrage in the west after World War I when American and British women finally gained the vote. The Japanese government reacted to women’s demands with a gradualist approach. In 1925, it granted universal manhood suffrage and by 1930 and 1931, the lower house of the Diet (legislature) passed bills granting women’s suffrage at the local level. However, as the political situation abroad changed dramatically in the 1930s and the Japanese military began a war in China, the movement to grant women’s political rights went by the wayside. Women’s rights advocates mostly supported the state during the period, hoping that their loyalty would enable them to influence policy on mothers and children.
  • Women’s political rights were granted after the war in 1945. But the story of how they came to be deserves some attention. The main issue here is what Mire Koikari has called the “myth of American emancipation of Japanese women,” for this period has often been misunderstood. In the fall of 1945, the head of the Occupation (SCAP) General Douglas MacArthur presented a list of demands to the Japanese government, including the demand that women get the vote. However, feminist leader Ichikawa Fusae and her fellow activists had already been lobbying the Japanese cabinet to grant women’s suffrage even before the Occupation arrived. Ichikawa did not want a foreign power to be responsible for granting women the right to vote. The Japanese cabinet was supportive of her initiative. Nevertheless, the subsequent course of events—a revised electoral law granting women the right to vote and stand for office was passed in December 1945—meant that the Occupation could take credit for enfranchising women. This view overlooks the efforts of Japanese women as early as the 1920s, their activities in the immediate aftermath of war, and the Japanese government’s support of their demands.
  • Most familiar to western audiences is the story of Beate Sirota Gordon’s role in proposing the gender equality clauses in the postwar Japanese constitution (Articles 14 and 24). At the time, Gordon, who was born in Vienna to Russian-Jewish parents but grew up in Japan, had returned to work for the Occupation as a naturalized American citizen. She was part of a group of Americans charged with the task of rewriting the constitution. Gordon later published her memoir The Only Woman in the Room (1997) relating her critical role in writing this legislation. She has been celebrated in some western and Japanese circles ever since. Yet Gordon’s story has also been subject to critique from several angles. For example, Mire Koikari sheds light on Gordon’s participation in “imperial feminism,” since Gordon portrayed herself and was portrayed by others as liberating Japanese women. As Koikari adds, “In drafting women’s rights articles, Gordon tapped into her childhood memory where the Orientalist imagery of oppressed and helpless Japanese women predominated.”[7]
  • The point here is not to ignore Gordon’s contribution to the constitution for she did indeed draft the gender equality legislation, but rather to place her work in a larger context. In fact, as we saw, Japanese women had been working for political rights for decades. The granting of women’s political rights and guarantees of gender equality cannot be seen as a case where a progressive west granted passive Japanese women political rights.  (On a different but related note, acknowledging the agency of Japanese women also means recognizing their complicity in wartime militarism and nationalism, as Koikari emphasizes.)

Adam Serwer: White Nationalism's Deep American Roots - The Atlantic - 0 views

  • The concept of “white genocide”—extinction under an onslaught of genetically or culturally inferior nonwhite interlopers—may indeed seem like a fringe conspiracy theory with an alien lineage, the province of neo-Nazis and their fellow travelers. In popular memory, it’s a vestige of a racist ideology that the Greatest Generation did its best to scour from the Earth.
  • History, though, tells a different story.
  • King’s recent question, posed in a New York Times interview, may be appalling: “White nationalist, white supremacist, Western civilization—how did that language become offensive?” But it is apt. “That language” has an American past in need of excavation. Without such an effort, we may fail to appreciate the tenacity of the dogma it expresses, and the difficulty of eradicating it.
  • ...45 more annotations...
  • “Even though the Germans had been directly influenced by Madison Grant and the American eugenics movement, when we fought Germany, because Germany was racist, racism became unacceptable in America. Our enemy was racist; therefore we adopted antiracism as our creed.” Ever since, a strange kind of historical amnesia has obscured the American lineage of this white-nationalist ideology.
  • What is judged extremist today was once the consensus of a powerful cadre of the American elite, well-connected men who eagerly seized on a false doctrine of “race suicide” during the immigration scare of the early 20th century. They included wealthy patricians, intellectuals, lawmakers, even several presidents.
  • Madison Grant. He was the author of a 1916 book called The Passing of the Great Race, which spread the doctrine of race purity all over the globe.
  • Grant’s purportedly scientific argument that the exalted “Nordic” race that had founded America was in peril, and all of modern society’s accomplishments along with it, helped catalyze nativist legislators in Congress to pass comprehensive restrictionist immigration policies in the early 1920s. His book went on to become Adolf Hitler’s “bible,” as the führer wrote to tell him
  • Grant’s doctrine has since been rejuvenated and rebranded by his ideological descendants as “white genocide.”
  • The cross between a white man and an Indian is an Indian; the cross between a white man and a Negro is a Negro; the cross between a white man and a Hindu is a Hindu; and the cross between any of the three European races and a Jew is a Jew.
  • When Nazism reflected back that vision in grotesque form, wartime denial set in.
  • In 1853, across the Atlantic, Joseph Arthur de Gobineau, a French count, first identified the “Aryan” race as “great, noble, and fruitful in the works of man on this earth.”
  • In 1899, William Z. Ripley, an economist, concluded that Europeans consisted of “three races”: the brave, beautiful, blond “Teutons”; the stocky “Alpines”; and the swarthy “Mediterraneans.”
  • Another leading academic contributor to race science in turn-of-the-century America was a statistician named Francis Walker, who argued in The Atlantic that the new immigrants lacked the pioneer spirit of their predecessors; they were made up of “beaten men from beaten races,” whose offspring were crowding out the fine “native” stock of white people.
  • In 1901 the sociologist Edward A. Ross, who similarly described the new immigrants as “masses of fecund but beaten humanity from the hovels of far Lombardy and Galicia,” coined the term race suicide.
  • It was Grant who synthesized these separate strands of thought into one pseudo-scholarly work that changed the course of the nation’s history. In a nod to wartime politics, he referred to Ripley’s “Teutons” as “Nordics,” thereby denying America’s hated World War I rivals exclusive claim to descent from the world’s master race. He singled out Jews as a source of anxiety disproportionate to their numbers.
  • The historian Nell Irvin Painter sums up the race chauvinists’ view in The History of White People (2010): “Jews manipulate the ignorant working masses—whether Alpine, Under-Man, or colored.”
  • In The Passing of the Great Race, the eugenic focus on winnowing out unfit individuals made way for a more sweeping crusade to defend against contagion by inferior races. By Grant’s logic, infection meant obliteration:
  • The seed of Nazism’s ultimate objective—the preservation of a pure white race, uncontaminated by foreign blood—was in fact sown with striking success in the United States.
  • Grant, emphasizing the American experience in particular, agreed. In The Passing of the Great Race, he had argued that
  • Teddy Roosevelt, by then out of office, told Grant in 1916 that his book showed “fine fearlessness in assailing the popular and mischievous sentimentalities and attractive and corroding falsehoods which few men dare assail.”
  • President Warren Harding publicly praised one of Grant’s disciples, Lothrop Stoddard, whose book The Rising Tide of Color Against White World-Supremacy offered similar warnings about the destruction of white society by invading dusky hordes. There is “a fundamental, eternal, inescapable difference” between the races, Harding told his audience. “Racial amalgamation there cannot be.”
  • Calvin Coolidge found Grant’s thesis equally compelling. “There are racial considerations too grave to be brushed aside for any sentimental reasons. Biological laws tell us that certain divergent people will not mix or blend,” Coolidge wrote in a 1921 article in Good Housekeeping. “The Nordics propagate themselves successfully. With other races, the outcome shows deterioration on both sides. Quality of mind and body suggests that observance of ethnic law is as great a necessity to a nation as immigration law.”
  • On Capitol Hill debate raged, yet Republicans and Democrats were converging on the idea that America was a white man’s country, and must stay that way. The influx of foreigners diluted the nation with inferiors unfit for self-government, many politicians in both parties energetically concurred. The Supreme Court chimed in with decisions in a series of cases, beginning in 1901, that assigned the status of “nationals” rather than “citizens” to colonial newcomers.
  • A popular myth of American history is that racism is the exclusive province of the South. The truth is that much of the nativist energy in the U.S. came from old-money elites in the Northeast, and was also fueled by labor struggles in the Pacific Northwest, which had stirred a wave of bigotry that led to the Chinese Exclusion Act of 1882
  • In 1917, overriding President Woodrow Wilson’s veto, Congress passed a law that banned immigration not just from Asian but also from Middle Eastern countries and imposed a literacy test on new immigrants
  • When the Republicans took control of the House in 1919, Johnson became chair of the committee on immigration, “thanks to some shrewd lobbying by the Immigration Restriction League,” Spiro writes. Grant introduced him to a preeminent eugenicist named Harry Laughlin, whom Johnson named the committee’s “expert eugenics agent.” His appointment helped ensure that Grantian concerns about “race suicide” would be a driving force in a quest that culminated, half a decade later, in the Immigration Act of 1924.
  • Meanwhile, the Supreme Court was struggling mightily to define whiteness in a consistent fashion, an endeavor complicated by the empirical flimsiness of race science. In one case after another, the high court faced the task of essentially tailoring its definition to exclude those whom white elites considered unworthy of full citizenship.
  • In 1923, when an Indian veteran named Bhagat Singh Thind—who had fought for the U.S. in World War I—came before the justices with the claim of being Caucasian in the scientific sense of the term, and therefore entitled to the privileges of whiteness, they threw up their hands. In a unanimous ruling against Thind (who was ultimately made a citizen in 1936), Justice George Sutherland wrote: “What we now hold is that the words ‘free white persons’ are words of common speech to be interpreted in accordance with the understanding of the common man, synonymous with the word ‘Caucasian’ only as that word is popularly understood.” The justices had unwittingly acknowledged a consistent truth about racism, which is that race is whatever those in power say it is.
  • Grant felt his life’s work had come to fruition and, according to Spiro, he concluded, “We have closed the doors just in time to prevent our Nordic population being overrun by the lower races.” Senator Reed announced in a New York Times op-ed, “The racial composition of America at the present time thus is made permanent.” Three years later, in 1927, Johnson held forth in dire but confident tones in a foreword to a book about immigration restriction. “Our capacity to maintain our cherished institutions stands diluted by a stream of alien blood, with all its inherited misconceptions respecting the relationships of the governing power to the governed,” he warned. “The United States is our land … We intend to maintain it so. The day of unalloyed welcome to all peoples, the day of indiscriminate acceptance of all races, has definitely ended.”
  • “It was America that taught us a nation should not open its doors equally to all nations,” Adolf Hitler told The New York Times half a decade later, just one year before his elevation to chancellor in January 1933. Elsewhere he admiringly noted that the U.S. “simply excludes the immigration of certain races. In these respects America already pays obeisance, at least in tentative first steps, to the characteristic völkisch conception of the state.”
  • Harry Laughlin, the scientific expert on Representative Johnson’s committee, told Grant that the Nazis’ rhetoric sounds “exactly as though spoken by a perfectly good American eugenist,” and wrote that “Hitler should be made honorary member of the Eugenics Research Association.”
  • What the Nazis “found exciting about the American model didn’t involve just eugenics,
  • “It also involved the systematic degradation of Jim Crow, of American deprivation of basic rights of citizenship like voting.”
  • Nazi lawyers carefully studied how the United States, despite its pretense of equal citizenship, had effectively denied that status to those who were not white. They looked at Supreme Court decisions that withheld full citizenship rights from nonwhite subjects in U.S. colonial territories. They examined cases that drew, as Thind’s had, arbitrary but hard lines around who could be considered “white.
  • Krieger, whom Whitman describes as “the single most important figure in the Nazi assimilation of American race law,” considered the Fourteenth Amendment a problem: In his view, it codified an abstract ideal of equality at odds with human experience, and with the type of country most Americans wanted to live in.
  • He blended Nordic boosterism with fearmongering, and supplied a scholarly veneer for notions many white citizens already wanted to believe
  • it has taken us fifty years to learn that speaking English, wearing good clothes and going to school and to church do not transform a Negro into a white man.
  • The authors of the Fourteenth Amendment, he believed, had failed to see a greater truth as they made good on the promise of the Declaration of Independence that all men are created equal: The white man is more equal than the others.
  • two “rival principles of national unity.” According to one, the U.S. is the champion of the poor and the dispossessed, a nation that draws its strength from its pluralism. According to the other, America’s greatness is the result of its white and Christian origins, the erosion of which spells doom for the national experiment.
  • Grantism, despite its swift wartime eclipse, did not become extinct. The Nazis, initially puzzled by U.S. hostility, underestimated the American commitment to democracy.
  • The South remained hawkish toward Nazi Germany because white supremacists in the U.S. didn’t want to live under a fascist government. What they wanted was a herrenvolk democracy, in which white people were free and full citizens but nonwhites were not.
  • The Nazis failed to appreciate the significance of that ideological tension. They saw allegiance to the American creed as a weakness. But U.S. soldiers of all backgrounds and faiths fought to defend it, and demanded that their country live up to it
  • Historical amnesia, the excision of the memory of how the seed of racism in America blossomed into the Third Reich in Europe, has allowed Grantism to be resurrected with a new name.
  • Grant’s philosophical framework has found new life among extremists at home and abroad, and echoes of his rhetoric can be heard from the Republican base and the conservative media figures the base trusts, as well as—once again—in the highest reaches of government.
  • The resurrection of race suicide as white genocide can be traced to the white supremacist David Lane, who claimed that “the term ‘racial integration’ is only a euphemism for genocide,” and whose infamous “fourteen words” manifesto, published in the 1990s, distills his credo: “We must secure the existence of our people and a future for white children.” Far-right intellectuals in Europe speak of “the great replacement” of Europeans by nonwhite immigrants and refugees.
  • That nations make decisions about appropriate levels of immigration is not inherently evil or fascist. Nor does the return of Grantian ideas to mainstream political discourse signal an inevitable march to Holocaust-level crimes against humanity.
  • The most benignly intentioned mainstream-media coverage of demographic change in the U.S. has a tendency to portray as justified the fear and anger of white Americans who believe their political power is threatened by immigration—as though the political views of today’s newcomers were determined by genetic inheritance rather than persuasion.
  • The danger of Grantism, and its implications for both America and the world, is very real. External forces have rarely been the gravest threat to the social order and political foundations of the United States. Rather, the source of greatest danger has been those who would choose white purity over a diverse democracy.

Will Florida be lost forever to the climate crisis? | Environment | The Guardian - 0 views

  • Few places on the planet are more at risk from the climate crisis than south Florida, where more than 8 million residents are affected by the convergence of almost every modern environmental challenge – from rising seas to contaminated drinking water, more frequent and powerful hurricanes, coastal erosion, flooding and vanishing wildlife and habitat.
  • Below are some of the biggest threats posed by the climate crisis to south Florida today, along with solutions under consideration. Some of these solutions will have a lasting impact on the fight. Others, in many cases, are only delaying the inevitable. But in every situation, doing something is preferable to doing nothing at all.
  • Sea level rise. The threat: By any estimation, Florida is drowning. In some scenarios, sea levels will rise up to 31in by 2060, a devastating prediction for a region that already deals regularly with tidal flooding and where an estimated 120,000 properties on or near the water are at risk. The pace of the rise is also hastening, scientists say – it took 31 years for the waters around Miami to rise by six inches, while the next six inches will take only 15 more.
  • ...17 more annotations...
  • The cost: The participating counties and municipalities are contributing to a $4bn statewide spend, including Miami Beach’s $400m Forever Bond, a $1bn stormwater plan and $250m of improvements to Broward county’s sewage systems to protect against flooding and seawater seepage. In the Keys, many consider the estimated $60m a mile cost of raising roads too expensive.
  • The threat: Saltwater from sea level rise is seeping further inland through Florida’s porous limestone bedrock and contaminating underground freshwater supplies, notably in the Biscayne aquifer, the 4,000-sq mile shallow limestone basin that provides drinking water to millions in southern Florida. Years of over-pumping and toxic runoff from farming and the sugar industry in central Florida and the Everglades have worsened the situation. The Florida department of environmental protection warned in March that “existing sources of water will not adequately meet the reasonable beneficial needs for the next 20 years”. A rising water table, meanwhile, has exacerbated problems with south Florida’s ageing sewage systems. Since December, millions of gallons of toxic, raw sewage have spilled on to Fort Lauderdale’s streets from a series of pipe failures.
  • The cost: The Everglades restoration plan was originally priced at $7.8bn, rose to $10.5bn, and has since ballooned to $16.4bn. Donald Trump’s proposed 2021 federal budget includes $250m for Everglades restoration. The estimated $1.8bn cost of the reservoir will be split between federal and state budgets.
  • Possible solutions
  • The cost: With homeowners and businesses largely bearing their own costs, the specific amount spent on “hurricane-proofing” in Florida is impossible to know. A 2018 Pew research study documented $1.3bn in hazard mitigation grants from federal and state funding in 2017, along with a further $8bn in post-disaster grants. Florida is spending another $633m from the US Department of Housing and Urban Development on resiliency planning.
  • Wildlife and habitat loss. The threat: Florida’s native flora and fauna are being devastated by climate change, with the Florida Natural Areas Inventory warning that a quarter of the 1,200 species it tracks are set to lose more than half their existing habitat, and the state’s beloved manatees and Key deer are at risk of extinction. Warmer and more acidic seas reduce other species’ food stocks and exacerbate the deadly red-tide algal blooms that have killed incalculable numbers of fish, turtles, dolphins and other marine life. Bleaching and stony coral tissue disease linked to the climate crisis threaten to hasten the demise of the Great Florida Reef, the only living coral reef in the continental US. Encroaching saltwater has turned Big Pine Key, a crucial deer habitat, into a ghost forest.
  • As for the Key deer, of which fewer than 1,000 remain, volunteers leave clean drinking water to replace salt-contaminated watering holes as herds retreat to higher ground. A longer-term debate is under way on the merits and ethics of relocating the species to other areas of Florida or the US.
  • Coastal erosion. The threat: Tourist brochures showcase miles of golden, sandy beaches in South Florida, but the reality is somewhat different. The Florida department of environmental protection deems the entire coastline from Miami to Cape Canaveral “critically eroded”, the result of sea level rise, historically high tides and especially storm surges from a succession of powerful hurricanes. In south-eastern Florida’s Palm Beach, Broward, Miami-Dade and Monroe counties, authorities are waging a continuous war on sand loss, eager to maintain their picture-perfect image and protect two of their biggest sources of income, tourism dollars and lucrative property taxes from waterfront homes and businesses.
  • In the devastating hurricane season just one year before, major storms named Harvey, Maria and Irma combined to cause damage estimated at $265bn. Scientists have evidence the climate crisis is causing cyclones to be more powerful, and intensify more quickly, and Florida’s position at the end of the Atlantic Ocean’s “hurricane alley” makes it twice as vulnerable as any other state.
  • With the only alternative being to abandon beaches to the elements, city and county commissions have little choice but to pursue costly replenishment projects involving sand replacement and jetty construction. Federal law prohibits the importation of cheaper foreign sand, so the municipalities must source a more expensive alternative from US markets, often creating friction with residents who don’t want to part with their sand. To supplement sand replenishment, the Nature Conservancy is a partner in a number of nature-based coastal defense projects from West Palm Beach to Miami.
  • benefited from 61,000 cubic yards of new sand this year at a cost of $16m. Statewide, Florida spends an average $50m annually on beach erosion.
  • The threat: “Climate gentrification” is a buzzword around south Florida, a region barely 6ft above sea level where land has become increasingly valuable in elevated areas. Speculators and developers are eyeing historically black, working-class and poorer areas, pushing out long-term residents and replacing affordable housing with upscale developments and luxury accommodations that only the wealthy can afford.
  • No study has yet calculated the overall cost of affordable housing lost to the climate crisis. Private developers will bear the expense of mitigating the impact on the neighborhood – $31m in Magic City’s case over 15 years to the Little Haiti Revitalization Trust, largely for new “green” affordable housing. The University of Miami’s housing solutions lab has a $300,000 grant from JPMorgan to report on the impact of rising seas on South Florida’s affordable housing stocks and recommend modifications to protect it from flooding and other climate events. A collaboration of not-for-profit groups is chasing $75m in corporate funding for affordable housing along the 70-mile south Florida rail trail from Miami to West Palm Beach, with the first stage, a $5m project, under way to identify, build and renovate 300 units.
  • Florida has long been plagued by political leadership more in thrall to the interests of big industry than the environment. As governor from 2011 to 2019, Rick Scott, now a US senator, slashed $700m from Florida’s water management budget, rolled back environmental regulations and enforcement, gave a free ride to polluters, and flip-flopped over expanding offshore oil drilling. The politician who came to be known as “Red Tide Rick”, for his perceived inaction over 2018’s toxic algae bloom outbreaks, reportedly banned the words “climate change” and “global warming” from state documents.
  • Last month, state legislators approved the first dedicated climate bill. It appears a promising start for a new administration, but activists say more needs to be done. In January, the Sierra Club awarded DeSantis failing grades in an environmental report card, saying he failed to protect Florida’s springs and rivers and approved new roads that threatened protected wildlife.
  • The cost: Florida’s spending on the environment is increasing. The state budget passed last month included $650m for Everglades restoration and water management projects (an instalment of DeSantis’s $2.5bn four-year pledge) and $100m for Florida Forever. A $100m bridge project jointly funded by the state and federal governments will allow the free flow of water under the Tamiami Trail for the first time in decades.
  • Florida has woken up to the threat of climate change but it is not yet clear how effective the response will be. The challenges are innumerable, the costs immense and the political will to fix or minimize the issues remains questionable, despite recent progress. At stake is the very future of one of the largest and most diverse states in the nation, in terms of both its population and its environment. Action taken now will determine its survival.

Opinion | Native Americans Paid for America's Land-Grant Universities - The New York Times - 0 views

  • “Even as Confederate victories in Virginia raised doubts about the future of the Union, Congress and President Abraham Lincoln kept their eyes on the horizon, enacting three landmark laws that shaped the nation’s next chapter.”
  • Among those laws was the Morrill Act of 1862, which appropriated land to fund agricultural and mechanical colleges — a national constellation of institutions known as land-grant universities. A graduate of Montana State University went on to develop vaccines; researchers at Iowa State bred the key corn variety in our food supply; the first email system was developed at M.I.T.
  • The Morrill Act was a wealth transfer disguised as a donation. The government took land from Indigenous people that it had paid little or nothing for and turned that land into endowments for fledgling universities.
  • ...2 more annotations...
  • An investigation we did for High Country News found that the act redistributed nearly 11 million acres, which is almost the size of Denmark. The grants came from more than 160 violence-backed land cessions made by close to 250 tribal nations. When adjusted for inflation, the windfall netted 52 universities roughly half a billion dollars.
  • A cleareyed history of how land-grant universities profited from violence and expropriation can provide a starting point to confront the nation’s record of genocide.

How the Civil War Became the Indian Wars - NYTimes.com - 0 views

  • On Dec. 21, 1866, a year and a half after Gen. Robert E. Lee and Gen. Ulysses S. Grant ostensibly closed the book on the Civil War’s final chapter at Appomattox Court House, another soldier, Capt. William Fetterman, led cavalrymen from Fort Phil Kearny, a federal outpost in Wyoming, toward the base of the Big Horn range
  • The Civil War was over, but the Indian wars were just beginning.
  • These two conflicts, long segregated in history and memory, were in fact intertwined. They both grew out of the process of establishing an American empire in the West. In 1860, competing visions of expansion transformed the presidential election into a referendum. Members of the Republican Party hearkened back to Jefferson’s dream of an “empire for liberty.” The United States, they said, should move west, leaving slavery behind. This free soil platform stood opposite the splintered Democrats’ insistence that slavery, unfettered by federal regulations, should be allowed to root itself in new soil.
  • ...14 more annotations...
  • Never ones to let a serious crisis go to waste, leading Republicans seized the ensuing constitutional crisis as an opportunity to remake the nation’s political economy and geography. In the summer of 1862, as Lincoln mulled over the Emancipation Proclamation’s details, officials in his administration created the Department of Agriculture, while Congress passed the Morrill Land Grant Act, the Pacific Railroad Act and the Homestead Act.
  • As a result, federal authorities could offer citizens a deal: Enlist to fight for Lincoln and liberty, and receive, as fair recompense for their patriotic sacrifices, higher education and Western land connected by rail to markets. It seemed possible that liberty and empire might advance in lock step.
  • The project of continental expansion fostered sectional reconciliation. Northerners and Southerners agreed on little at the time except that the Army should pacify Western tribes. Even as they fought over the proper role for the federal government, the rights of the states, and the prerogatives of citizenship, many Americans found rare common ground on the subject of Manifest Destiny.
  • Many American soldiers, whether they had fought for the Union or the Confederacy, redeployed to the frontier. They became shock troops of empire. The federal project of demilitarization, paradoxically, accelerated the conquest and colonization of the West.
  • The Indian wars of the Reconstruction era devastated not just Native American nations but also the United States.
  • For a moment, it seemed that the federal government could accomplish great things. But in the West, Native Americans would not simply vanish
  • Red Cloud’s War, then, undermined a utopian moment and blurred the Republican Party’s vision for expansion.
  • At least the Grant administration had a plan. After he took office in 1869, President Grant promised that he would pursue a “peace policy” to put an end to violence in the West, opening the region to settlers. By feeding rather than fighting Indians, federal authorities would avoid further bloodshed with the nation’s indigenous peoples. The process of civilizing and acculturating Native nations into the United States could begin.
  • President Grant’s Peace Policy perished in the Modoc War. The horror of that conflict, and the Indian wars more broadly, coupled with an endless array of political scandals and violence in the states of the former Confederacy – including the brutal murder, on Easter Sunday 1873 in Colfax, La., of at least 60 African-Americans – diminished support for the Grant administration’s initiatives in the South and the West.
  • One hundred and fifty years after the Civil War, collective memory casts that conflict as a war of liberation, entirely distinct from the Indian wars.
  • Though Reconstruction is typically recalled in the popular imagination as both more convoluted and contested – whether thwarted by intransigent Southerners, doomed to fail by incompetent and overweening federal officials, or perhaps some combination of the two – it was well intended nevertheless, an effort to make good on the nation’s commitment to freedom and equality.
  • But this is only part of the story. The Civil War emerged out of struggles between the North and South over how best to settle the West – struggles, in short, over who would shape an emerging American empire. Reconstruction in the West then devolved into a series of conflicts with Native Americans
  • So, while the Civil War and its aftermath boasted moments of redemption and days of jubilee, the era also featured episodes of subjugation and dispossession, patterns that would repeat themselves in the coming years.
  • When Chief Joseph surrendered, the United States secured its empire in the West. The Indian wars were over, but an era of American imperialism was just beginning.

12 Rules for Life: An Antidote to Chaos (Jordan B. Peterson) - 0 views

  • RULES? MORE RULES? REALLY? Isn’t life complicated enough, restricting enough, without abstract rules that don’t take our unique, individual situations into account? And given that our brains are plastic, and all develop differently based on our life experiences, why even expect that a few rules might be helpful to us all?
  • “I’ve got some good news…and I’ve got some bad news,” the lawgiver yells to them. “Which do you want first?” “The good news!” the hedonists reply. “I got Him from fifteen commandments down to ten!” “Hallelujah!” cries the unruly crowd. “And the bad?” “Adultery is still in.”
  • Maps of Meaning was sparked by Jordan’s agonized awareness, as a teenager growing up in the midst of the Cold War, that much of mankind seemed on the verge of blowing up the planet to defend their various identities. He felt he had to understand how it could be that people would sacrifice everything for an “identity.”
  • ...297 more annotations...
  • the story of the golden calf also reminds us that without rules we quickly become slaves to our passions—and there’s nothing freeing about that.
  • And the story suggests something more: unchaperoned, and left to our own untutored judgment, we are quick to aim low and worship qualities that are beneath us—in this case, an artificial animal that brings out our own animal instincts in a completely unregulated way.
  • Similarly, in this book Professor Peterson doesn’t just propose his twelve rules, he tells stories, too, bringing to bear his knowledge of many fields as he illustrates and explains why the best rules do not ultimately restrict us but instead facilitate our goals and make for fuller, freer lives.
  • Peterson wasn’t really an “eccentric”; he had sufficient conventional chops, had been a Harvard professor, was a gentleman (as cowboys can be) though he did say damn and bloody a lot, in a rural 1950s sort of way. But everyone listened, with fascination on their faces, because he was in fact addressing questions of concern to everyone at the table.
  • unlike many academics who take the floor and hold it, if someone challenged or corrected him he really seemed to like it. He didn’t rear up and neigh. He’d say, in a kind of folksy way, “Yeah,” and bow his head involuntarily, wag it if he had overlooked something, laughing at himself for overgeneralizing. He appreciated being shown another side of an issue, and it became clear that thinking through a problem was, for him, a dialogic process.
  • for an egghead Peterson was extremely practical. His examples were filled with applications to everyday life: business management, how to make furniture (he made much of his own), designing a simple house, making a room beautiful (now an internet meme) or in another, specific case related to education, creating an online writing project that kept minority students from dropping out of school by getting them to do a kind of psychoanalytic exercise on themselves,
  • These Westerners were different: self-made, unentitled, hands on, neighbourly and less precious than many of their big-city peers, who increasingly spend their lives indoors, manipulating symbols on computers. This cowboy psychologist seemed to care about a thought only if it might, in some way, be helpful to someone.
  • I was drawn to him because here was a clinician who also had given himself a great books education, and who not only loved soulful Russian novels, philosophy and ancient mythology, but who also seemed to treat them as his most treasured inheritance. But he also did illuminating statistical research on personality and temperament, and had studied neuroscience. Though trained as a behaviourist, he was powerfully drawn to psychoanalysis with its focus on dreams, archetypes, the persistence of childhood conflicts in the adult, and the role of defences and rationalization in everyday life. He was also an outlier in being the only member of the research-oriented Department of Psychology at the University of Toronto who also kept a clinical practice.
  • Maps of Meaning, published nearly two decades ago, shows Jordan’s wide-ranging approach to understanding how human beings and the human brain deal with the archetypal situation that arises whenever we, in our daily lives, must face something we do not understand.
  • The brilliance of the book is in his demonstration of how rooted this situation is in evolution, our DNA, our brains and our most ancient stories. And he shows that these stories have survived because they still provide guidance in dealing with uncertainty, and the unavoidable unknown.
  • this is why many of the rules in this book, being based on Maps of Meaning, have an element of universality to them.
  • We are ambivalent about rules, even when we know they are good for us. If we are spirited souls, if we have character, rules seem restrictive, an affront to our sense of agency and our pride in working out our own lives. Why should we be judged according to another’s rule?
  • And he felt he had to understand the ideologies that drove totalitarian regimes to a variant of that same behaviour: killing their own citizens.
  • Ideologies are simple ideas, disguised as science or philosophy, that purport to explain the complexity of the world and offer remedies that will perfect it.
  • Ideologues are people who pretend they know how to “make the world a better place” before they’ve taken care of their own chaos within.
  • Ideologies are substitutes for true knowledge, and ideologues are always dangerous when they come to power, because a simple-minded I-know-it-all approach is no match for the complexity of existence.
  • To understand ideology, Jordan read extensively about not only the Soviet gulag, but also the Holocaust and the rise of Nazism. I had never before met a person, born Christian and of my generation, who was so utterly tormented by what happened in Europe to the Jews, and who had worked so hard to understand how it could have occurred.
  • I saw what now millions have seen online: a brilliant, often dazzling public speaker who was at his best riffing like a jazz artist; at times he resembled an ardent Prairie preacher (not in evangelizing, but in his passion, in his ability to tell stories that convey the life-stakes that go with believing or disbelieving various ideas). Then he’d just as easily switch to do a breathtakingly systematic summary of a series of scientific studies. He was a master at helping students become more reflective, and take themselves and their futures seriously. He taught them to respect many of the greatest books ever written. He gave vivid examples from clinical practice, was (appropriately) self-revealing, even of his own vulnerabilities, and made fascinating links between evolution, the brain and religious stories.
  • Above all, he alerted his students to topics rarely discussed in university, such as the simple fact that all the ancients, from Buddha to the biblical authors, knew what every slightly worn-out adult knows, that life is suffering.
  • chances are, if you or someone you love is not suffering now, they will be within five years, unless you are freakishly lucky. Rearing kids is hard, work is hard, aging, sickness and death are hard, and Jordan emphasized that doing all that totally on your own, without the benefit of a loving relationship, or wisdom, or the psychological insights of the greatest psychologists, only makes it harder.
  • focused on triumphant heroes. In all these triumph stories, the hero has to go into the unknown, into an unexplored territory, and deal with a new great challenge and take great risks. In the process, something of himself has to die, or be given up, so he can be reborn and meet the challenge. This requires courage, something rarely discussed in a psychology class or textbook.
  • Views of Jordan's first YouTube statements quickly numbered in the hundreds of thousands. But people have kept listening because what he is saying meets a deep and unarticulated need. And that is because alongside our wish to be free of rules, we all search for structure.
  • the first generation to have been so thoroughly taught two seemingly contradictory ideas about morality, simultaneously—at their schools, colleges and universities, by many in my own generation. This contradiction has left them at times disoriented and uncertain, without guidance and, more tragically, deprived of riches they don’t even know exist.
  • morality and the rules associated with it are just a matter of personal opinion or happenstance, “relative to” or “related to” a particular framework, such as one’s ethnicity, one’s upbringing, or the culture or historical…
  • The first idea or teaching is that morality is relative, at best a…
  • So, the decent thing to do—once it becomes apparent how arbitrary your, and your society’s, “moral values” are—is to show tolerance for people who think differently, and…
  • That emphasis on tolerance is so paramount that for many people one of the worst character flaws a person can have is to be “judgmental.”* And, since we don’t know right from wrong, or what is good, just about the most inappropriate thing an…
  • And so a generation has been raised untutored in what was once called, aptly, “practical wisdom,” which guided previous generations. Millennials, often told they have received the finest education available anywhere, have actually…
  • professors, chose to devalue thousands of years of human knowledge about how to acquire virtue, dismissing it as passé, “…
  • They were so successful at it that the very word “virtue” sounds out of date, and someone using it appears…
  • The study of virtue is not quite the same as the study of morals (right and wrong, good and evil). Aristotle defined the virtues simply as the ways of behaving that are most conducive to happiness in life. Vice was…
  • Cultivating judgment about the difference between virtue and vice is the beginning of wisdom, something…
  • By contrast, our modern relativism begins by asserting that making judgments about how to live is impossible, because there is no real good, and no…
  • Thus relativism’s closest approximation to “virtue” is “tolerance.” Only tolerance will provide social cohesion between different groups, and save us from harming each other. On Facebook and other forms of social media, therefore, you signal your so-called…
  • Intolerance of others’ views (no matter how ignorant or incoherent they may be) is not simply wrong; in a world where there is no right or wrong, it is worse: it is a sign you are…
  • But it turns out that many people cannot tolerate the vacuum—the chaos—which is inherent in life, but made worse by this moral relativism; they cannot live without a moral compass,…
  • So, right alongside relativism, we find the spread of nihilism and despair, and also the opposite of moral relativism: the blind certainty offered by ideologies…
  • Dr. Norman Doidge, MD, is the author of The Brain That Changes Itself
  • so we arrive at the second teaching that millennials have been bombarded with. They sign up for a humanities course, to study the greatest books ever written. But they’re not assigned the books; instead they are given…
  • (But the idea that we can easily separate facts and values was and remains naive; to some extent, one’s values determine what one will pay…
  • For the ancients, the discovery that different people have different ideas about how, practically, to live, did not paralyze them; it deepened their understanding of humanity and led to some of the most satisfying conversations human beings have ever had, about how life might be lived.
  • Modern moral relativism has many sources. As we in the West learned more history, we understood that different epochs had different moral codes. As we travelled the seas and explored the globe, we learned of far-flung tribes on different continents whose different moral codes made sense relative to, or within the framework of, their societies. Science played a role, too, by attacking the religious view of the world, and thus undermining the religious grounds for ethics and rules. Materialist social science implied that we could divide the world into facts (which all could observe, and were objective and “real”) and values (…
  • it seems that all human beings are, by some kind of biological endowment, so ineradicably concerned with morality that we create a structure of laws and rules wherever we are. The idea that human life can be free of moral concerns is a fantasy.
  • given that we are moral animals, what must be the effect of our simplistic modern relativism upon us? It means we are hobbling ourselves by pretending to be something we are not. It is a mask, but a strange one, for it mostly deceives the one who wears it.
  • Far better to integrate the best of what we are now learning with the books human beings saw fit to preserve over millennia, and with the stories that have survived, against all odds, time’s tendency to obliterate.
  • these really are rules. And the foremost rule is that you must take responsibility for your own life. Period.
  • Jordan’s message that each individual has ultimate responsibility to bear; that if one wants to live a full life, one first sets one’s own house in order; and only then can one sensibly aim to take on bigger responsibilities.
  • if it’s uncertain that our ideals are attainable, why do we bother reaching in the first place? Because if you don’t reach for them, it is certain you will never feel that your life has meaning.
  • And perhaps because, as unfamiliar and strange as it sounds, in the deepest part of our psyche, we all want to be judged.
  • Instead of despairing about these differences in moral codes, Aristotle argued that though specific rules, laws and customs differed from place to place, what does not differ is that in all places human beings, by their nature, have a proclivity to make rules, laws and customs.
  • Freud never argued (as do some who want all culture to become one huge group therapy session) that one can live one’s entire life without ever making judgments, or without morality. In fact, his point in Civilization and Its Discontents is that civilization only arises when some restraining rules and morality are in place.
  • Aleksandr Solzhenitsyn, the great documenter of the slave-labour-camp horrors of the latter, once wrote that the “pitiful ideology” holding that “human beings are created for happiness” was an ideology “done in by the first blow of the work assigner’s cudgel.”1 In a crisis, the inevitable suffering that life entails can rapidly make a mockery of the idea that happiness is the proper pursuit of the individual. On the radio show, I suggested, instead, that a deeper meaning was required. I noted that the nature of such meaning was constantly re-presented in the great stories of the past, and that it had more to do with developing character in the face of suffering than with happiness.
  • I proposed in Maps of Meaning that the great myths and religious stories of the past, particularly those derived from an earlier, oral tradition, were moral in their intent, rather than descriptive. Thus, they did not concern themselves with what the world was, as a scientist might have it, but with how a human being should act.
  • I suggested that our ancestors portrayed the world as a stage—a drama—instead of a place of objects. I described how I had come
  • to believe that the constituent elements of the world as drama were order and chaos, and not material things.
  • Order is where the people around you act according to well-understood social norms, and remain predictable and cooperative. It’s the world of social structure, explored territory, and familiarity. The state of Order is typically portrayed, symbolically—imaginatively—as masculine.
  • Chaos, by contrast, is where—or when—something unexpected happens.
  • As the antithesis of symbolically masculine order, it’s presented imaginatively as feminine. It’s the new and unpredictable suddenly emerging in the midst of the commonplace familiar. It’s Creation and Destruction,
  • Order is the white, masculine serpent; Chaos, its black, feminine counterpart. The black dot in the white—and the white in the black—indicate the possibility of transformation: just when things seem secure, the unknown can loom, unexpectedly and large. Conversely, just when everything seems lost, new order can emerge from catastrophe and chaos.
  • For the Taoists, meaning is to be found on the border between the ever-entwined pair. To walk that border is to stay on the path of life, the divine Way. And that’s much better than happiness.
  • trying to address a perplexing problem: the reason or reasons for the nuclear standoff of the Cold War. I couldn’t understand how belief systems could be so important to people that they were willing to risk the destruction of the world to protect them. I came to realize that shared belief systems made people intelligible to one another—and that the systems weren’t just about belief.
  • People who live by the same code are rendered mutually predictable to one another. They act in keeping with each other’s expectations and desires. They can cooperate. They can even compete peacefully, because everyone knows what to expect from everyone else.
  • Shared beliefs simplify the world, as well, because people who know what to expect from one another can act together to tame the world. There is perhaps nothing more important than the maintenance of this organization—this simplification. If it’s threatened, the great ship of state rocks.
  • It isn’t precisely that people will fight for what they believe. They will fight, instead, to maintain the match between what they believe, what they expect, and what they desire. They will fight to maintain the match between what they expect and how everyone is acting. It is precisely the maintenance of that match that enables everyone
  • There’s more to it, too. A shared cultural system stabilizes human interaction, but is also a system of value—a hierarchy of value, where some things are given priority and importance and others are not. In the absence of such a system of value, people simply cannot act. In fact, they can’t even perceive, because both action and perception require a goal, and a valid goal is, by necessity, something valued.
  • We experience much of our positive emotion in relation to goals. We are not happy, technically speaking, unless we see ourselves progressing—and the very idea of progression implies value.
  • Worse yet is the fact that the meaning of life without positive value is not simply neutral. Because we are vulnerable and mortal, pain and anxiety are an integral part of human existence. We must have something to set against the suffering that is intrinsic to Being.*2 We must have the meaning inherent in a profound system of value or the horror of existence rapidly becomes paramount. Then, nihilism beckons, with its hopelessness and despair.
  • So: no value, no meaning. Between value systems, however, there is the possibility of conflict. We are thus eternally caught between the most diamantine rock and the hardest of places:
  • loss of group-centred belief renders life chaotic, miserable, intolerable; presence of group-centred belief makes conflict with other groups inevitable.
  • In the West, we have been withdrawing from our tradition-, religion- and even nation-centred cultures, partly to decrease the danger of group conflict. But we are increasingly falling prey to the desperation of meaninglessness, and that is no improvement at all.
  • While writing Maps of Meaning, I was (also) driven by the realization that we can no longer afford conflict—certainly not on the scale of the world conflagrations of the twentieth century.
  • I came to a more complete, personal realization of what the great stories of the past continually insist upon: the centre is occupied by the individual.
  • It is possible to transcend slavish adherence to the group and its doctrines and, simultaneously, to avoid the pitfalls of its opposite extreme, nihilism. It is possible, instead, to find sufficient meaning in individual consciousness and experience.
  • How could the world be freed from the terrible dilemma of conflict, on the one hand, and psychological and social dissolution, on the other? The answer was this: through the elevation and development of the individual, and through the willingness of everyone to shoulder the burden of Being and to take the heroic path. We must each adopt as much responsibility as possible for individual life, society and the world.
  • We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated. It is in this manner that we can and must reduce the suffering that poisons the world. It’s asking a lot. It’s asking for everything.
  • the alternative—the horror of authoritarian belief, the chaos of the collapsed state, the tragic catastrophe of the unbridled natural world, the existential angst and weakness of the purposeless individual—is clearly worse.
  • a title: 12 Rules for Life: An Antidote to Chaos. Why did that one rise up above all others? First and foremost, because of its simplicity. It indicates clearly that people need ordering principles, and that chaos otherwise beckons.
  • We require rules, standards, values—alone and together. We’re pack animals, beasts of burden. We must bear a load, to justify our miserable existence. We require routine and tradition. That’s order. Order can become excessive, and that’s not good, but chaos can swamp us, so we drown—and that is also not good. We need to stay on the straight and narrow path.
  • I hope that these rules and their accompanying essays will help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life.
  • RULE 1   STAND UP STRAIGHT WITH YOUR SHOULDERS BACK
  • Because territory matters, and because the best locales are always in short supply, territory-seeking among animals produces conflict. Conflict, in turn, produces another problem: how to win or lose without the disagreeing parties incurring too great a cost.
  • It’s winner-take-all in the lobster world, just as it is in human societies, where the top 1 percent have as much loot as the bottom 50 percent11—and where the richest eighty-five people have as much as the bottom three and a half billion.
  • This principle is sometimes known as Price’s law, after Derek J. de Solla Price,13 the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal.
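A minimal illustrative sketch of the "L-shaped" distribution described above, assuming a heavy-tailed Pareto draw and an arbitrary population size (the shape parameter, population size, and variable names are assumptions chosen for illustration only, not taken from the book):

    import numpy as np

    rng = np.random.default_rng(0)
    n_people = 100_000
    # Draw per-person "productivity" from a heavy-tailed Pareto distribution (assumed shape).
    productivity = rng.pareto(a=1.16, size=n_people) + 1

    # Rank from most to least productive and compute the cumulative share of the total.
    ranked = np.sort(productivity)[::-1]
    cumulative_share = np.cumsum(ranked) / ranked.sum()

    top_one_percent = n_people // 100
    price_cutoff = int(np.sqrt(n_people))  # Price's law: roughly sqrt(N) people account for half the output

    print(f"top 1% ({top_one_percent} people) produce {cumulative_share[top_one_percent - 1]:.0%} of the total")
    print(f"top sqrt(N) ({price_cutoff} people) produce {cumulative_share[price_cutoff - 1]:.0%} of the total")

With these assumed parameters, the bulk of the simulated output comes from a small fraction of the population, which is the kind of concentration the annotation describes.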
  • Instead of undertaking the computationally difficult task of identifying the best man, the females outsource the problem to the machine-like calculations of the dominance hierarchy. They let the males fight it out and peel their paramours from the top.
  • The dominant male, with his upright and confident posture, not only gets the prime real estate and easiest access to the best hunting grounds. He also gets all the girls. It is exponentially more worthwhile to be successful, if you are a lobster, and male.
  • dominance hierarchies have been an essentially permanent feature of the environment to which all complex life has adapted. A third of a billion years ago, brains and nervous systems were comparatively simple. Nonetheless, they already had the structure and neurochemistry necessary to process information about status and society. The importance of this fact can hardly be overstated.
  • evolution works, in large part, through variation and natural selection. Variation exists for many reasons, including gene-shuffling (to put it simply) and random mutation. Individuals vary within a species for such reasons. Nature chooses from among them, across time. That theory, as stated, appears to account for the continual alteration of life-forms over the eons.
  • But there’s an additional question lurking under the surface: what exactly is the “nature” in “natural selection”? What exactly is “the environment” to which animals adapt?
  • Nature “selects.” The idea of selects contains implicitly nested within it the idea of fitness. It is “fitness” that is “selected.” Fitness, roughly speaking, is the probability that a given organism will leave offspring (will propagate its genes through time). The “fit” in “fitness” is therefore the matching of organismal attribute to environmental demand.
  • But nature, the selecting agent, is not a static selector—not in any simple sense.
  • As the environment supporting a species transforms and changes, the features that make a given individual successful in surviving and reproducing also transform and change. Thus, the theory of natural selection does not posit creatures matching themselves ever more precisely to a template specified by the world. It is more that creatures are in a dance with nature, albeit one that is deadly.
  • Nature is not simply dynamic, either. Some things change quickly, but they are nested within other things that change less quickly (music
  • It’s chaos, within order, within chaos, within higher order. The order that is most real is the order that is most unchanging—and that is not necessarily the order that is most easily seen. The leaf, when perceived, might blind the observer to the tree. The tree can blind him to the forest.
  • It is also a mistake to conceptualize nature romantically.
  • Unfortunately, “the environment” is also elephantiasis and guinea worms (don’t ask), anopheles mosquitoes and malaria, starvation-level droughts, AIDS and the Black Plague.
  • It is because of the existence of such things, of course, that we attempt to modify our surroundings, protecting our children, building cities and transportation systems and growing food and generating power.
  • this brings us to a third erroneous concept: that nature is something strictly segregated from the cultural constructs that have emerged within it.
  • It does not matter whether that feature is physical and biological, or social and cultural. All that matters, from a Darwinian perspective, is permanence—and the dominance hierarchy, however social or cultural it might appear, has been around for some half a billion years.
  • The dominance hierarchy is not capitalism. It’s not communism, either, for that matter. It’s not the military-industrial complex. It’s not the patriarchy—that disposable, malleable, arbitrary cultural artefact. It’s not even a human creation; not in the most profound sense. It is instead a near-eternal aspect of the environment, and much of what is blamed on these more ephemeral manifestations is a consequence of its unchanging existence.
  • We were struggling for position before we had skin, or hands, or lungs, or bones. There is little more natural than culture. Dominance hierarchies are older than trees.
  • The part of our brain that keeps track of our position in the dominance hierarchy is therefore exceptionally ancient and fundamental.17 It is a master control system, modulating our perceptions, values, emotions, thoughts and actions. It powerfully affects every aspect of our Being, conscious and unconscious alike.
  • The ancient part of your brain specialized for assessing dominance watches how you are treated by other people. On that evidence, it renders a determination of your value and assigns you a status. If you are judged by your peers as of little worth, the counter restricts serotonin availability. That makes you much more physically and psychologically reactive to any circumstance or event that might produce emotion, particularly if it is negative. You need that reactivity. Emergencies are common at the bottom, and you must be ready to survive. Unfortunately, that physical hyper-response, that constant alertness, burns up a lot of precious energy and physical resources.
  • It will leave you far more likely to live, or die, carelessly, for a rare opportunity at pleasure, when it manifests itself. The physical demands of emergency preparedness will wear you down in every way.21
  • If you have a high status, on the other hand, the counter’s cold, pre-reptilian mechanics assume that your niche is secure, productive
  • You can delay gratification, without forgoing it forever. You can afford to be a reliable and thoughtful citizen.
  • Sometimes, however, the counter mechanism can go wrong. Erratic habits of sleeping and eating can interfere with its function. Uncertainty can throw it for a loop. The body, with its various parts, needs to function like a well-rehearsed orchestra. Every system must play its role properly, and at exactly the right time, or noise and chaos ensue. It is for this reason that routine is so necessary. The acts of life we repeat every day need to be automatized. They must be turned into stable and reliable habits, so they lose their complexity and gain predictability and simplicity.
  • It is for such reasons that I always ask my clinical clients first about sleep. Do they wake up in the morning at approximately the time the typical person wakes up, and at the same time every day?
  • The next thing I ask about is breakfast. I counsel my clients to eat a fat and protein-heavy breakfast as soon as possible after they awaken (no simple carbohydrates, no sugars,
  • I have had many clients whose anxiety was reduced to subclinical levels merely because they started to sleep on a predictable schedule and eat breakfast.
  • Other bad habits can also interfere with the counter’s accuracy.
  • There are many systems of interaction between brain, body and social world that can get caught in positive feedback loops. Depressed people, for example, can start feeling useless and burdensome, as well as grief-stricken and pained. This makes them withdraw from contact with friends and family. Then the withdrawal makes them more lonesome and isolated, and more likely to feel useless and burdensome. Then they withdraw more. In this manner, depression spirals and amplifies.
  • If someone is badly hurt at some point in life—traumatized—the dominance counter can transform in a manner that makes additional hurt more rather than less likely. This often happens in the case of people, now adults, who were viciously bullied during childhood or adolescence. They become anxious and easily upset. They shield themselves with a defensive crouch, and avoid the direct eye contact interpretable as a dominance challenge.
  • With their capacity for aggression strait-jacketed within a too-narrow morality, those who are only or merely compassionate and self-sacrificing (and naïve and exploitable) cannot call forth the genuinely righteous and appropriately self-protective anger necessary to defend themselves. If you can bite, you generally don’t have to. When skillfully integrated, the ability to respond with aggression and violence decreases rather than increases the probability that actual aggression will become necessary.
  • Naive, harmless people usually guide their perceptions and actions with a few simple axioms: people are basically good; no one really wants to hurt anyone else; the threat (and, certainly, the use) of force, physical or otherwise, is wrong. These axioms collapse, or worse, in the presence of individuals who are genuinely malevolent.27
  • I have had clients who were terrified into literally years of daily hysterical convulsions by the sheer look of malevolence on their attackers’ faces. Such individuals typically come from hyper-sheltered families, where nothing terrible is allowed to exist, and everything is fairyland wonderful (or else).
  • When the wakening occurs—when once-naïve people recognize in themselves the seeds of evil and monstrosity, and see themselves as dangerous (at least potentially)— their fear decreases. They develop more self-respect. Then, perhaps, they begin to resist oppression. They see that they have the ability to withstand, because they are terrible too. They see they can and must stand up, because they begin to understand how genuinely monstrous they will become, otherwise,
  • There is very little difference between the capacity for mayhem and destruction, integrated, and strength of character. This is one of the most difficult lessons of life.
  • even if you came by your poor posture honestly—even if you were unpopular or bullied at home or in grade school28—it’s not necessarily appropriate now. Circumstances change. If you slump around, with the same bearing that characterizes a defeated lobster, people will assign you a lower status, and the old counter that you share with crustaceans, sitting at the very base of your brain, will assign you a low dominance number.
  • the other, far more optimistic lesson of Price’s law and the Pareto distribution: those who start to have will probably get more.
  • Some of these upwardly moving loops can occur in your own private, subjective space.
  • If you are asked to move the muscles one by one into a position that looks happy, you will report feeling happier. Emotion is partly bodily expression, and can be amplified (or dampened) by that expression.29
  • To stand up straight with your shoulders back is to accept the terrible responsibility of life, with eyes wide open.
  • It means deciding to voluntarily transform the chaos of potential into the realities of habitable order. It means adopting the burden of self-conscious vulnerability, and accepting the end of the unconscious paradise of childhood, where finitude and mortality are only dimly comprehended. It means willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language).
  • So, attend carefully to your posture. Quit drooping and hunching around. Speak your mind. Put your desires forward, as if you had a right to them—at least the same right as others. Walk tall and gaze forthrightly ahead. Dare to be dangerous. Encourage the serotonin to flow plentifully through the neural pathways desperate for its calming influence.
  • Thus emboldened, you will embark on the voyage of your life, let your light shine, so to speak, on the heavenly hill, and pursue your rightful destiny. Then the meaning of your life may be sufficient to keep the corrupting influence of mortal despair at bay. Then you may be able to accept the terrible burden of the World, and find joy.
  • RULE 2   TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING
  • People are better at filling and properly administering prescription medication to their pets than to themselves. That
  • It is difficult to conclude anything from this set of facts except that people appear to love their dogs, cats, ferrets and birds (and maybe even their lizards) more than themselves. How horrible is that? How much shame must exist, for something like that to be true? What could it be about people that makes them prefer their pets to themselves?
  • To understand Genesis 1, the Priestly story, with its insistence on speech as the fundamental creative force, it is first necessary to review a few fundamental, ancient assumptions (these are markedly different in type and intent from the assumptions of science, which are, historically speaking, quite novel).
  • those who existed during the distant time in which the foundational epics of our culture emerged were much more concerned with the actions that dictated survival (and with interpreting the world in a manner commensurate with that goal) than with anything approximating what we now understand as objective truth.
  • Before the dawn of the scientific worldview, reality was construed differently. Being was understood as a place of action, not a place of things.31 It was understood as something more akin to story or drama. That story or drama was lived, subjective experience, as it manifested itself moment to moment in the consciousness of every living person.
  • subjective pain. That’s something so real no argument can stand against it. Everyone acts as if their pain is real—ultimately, finally real. Pain matters, more than matter matters. It is for this reason, I believe, that so many of the world’s traditions regard the suffering attendant upon existence as the irreducible truth of Being.
  • In any case, that which we subjectively experience can be likened much more to a novel or a movie than to a scientific description of physical reality.
  • The Domain, Not of Matter, but of What Matters
  • the world of experience has primal constituents, as well. These are the necessary elements whose interactions define drama and fiction. One of these is chaos. Another is order. The third (as there are three) is the process that mediates between the two, which appears identical to what modern people call consciousness.
  • Chaos is the domain of ignorance itself. It’s unexplored territory. Chaos is what extends, eternally and without limit, beyond the boundaries of all states, all ideas, and all disciplines. It’s the foreigner, the stranger, the member of another gang, the rustle in the bushes in the night-time,
  • It is, in short, all those things and situations we neither know nor understand.
  • Chaos is also the formless potential from which the God of Genesis 1 called forth order using language at the beginning of time. It’s the same potential from which we, made in that Image, call forth the novel and ever-changing moments of our lives. And Chaos is freedom, dreadful freedom, too.
  • Order, by contrast, is explored territory. That’s the hundreds-of-millions-of-years-old hierarchy of place, position and authority. That’s the structure of society. It’s the structure provided by biology, too—particularly insofar as you are adapted, as you are, to the structure of society. Order is tribe, religion, hearth, home and country.
  • Order is the public façade we’re called upon to wear, the politeness of a gathering of civilized strangers, and the thin ice on which we all skate. Order is the place where the behavior of the world matches our expectations and our desires; the place where all things turn out the way we want them to.
  • But order is sometimes tyranny and stultification, as well, when the demand for certainty and uniformity and purity becomes too one-sided.
  • In order, we’re able to think about things in the long term. There, things work, and we’re stable, calm and competent. We seldom leave places we understand—geographical or conceptual—for that reason, and we certainly do not like it when we are compelled to or when it happens accidentally.
  • When the same person betrays you, sells you out, you move from the daytime world of clarity and light to the dark underworld of chaos, confusion and despair. That’s the same move you make, and the same place you visit, when the company you work for starts to fail and your job is placed in doubt.
  • Before the Twin Towers fell—that was order. Chaos manifested itself afterward. Everyone felt it. The very air became uncertain. What exactly was it that fell? Wrong question. What exactly remained standing? That was the issue at hand.
  • Chaos is the deep ocean bottom to which Pinocchio voyaged to rescue his father from Monstro, whale and fire-breathing dragon. That journey into darkness and rescue is the most difficult thing a puppet must do, if he wants to be real; if he wants to extract himself from the temptations of deceit and acting and victimization and impulsive pleasure and totalitarian subjugation; if he wants to take his place as a genuine Being in the world.
  • Chaos is the new place and time that emerges when tragedy strikes suddenly, or malevolence reveals its paralyzing visage, even in the confines of your own home. Something unexpected or undesired can always make its appearance, when a plan is being laid out, regardless of how familiar the circumstances.
  • Our brains respond instantly when chaos appears, with simple, hyper-fast circuits maintained from the ancient days, when our ancestors dwelled in trees, and snakes struck in a flash.32 After that nigh-instantaneous, deeply reflexive bodily response comes the later-evolving, more complex but slower responses of emotions—and, after that, comes thinking, of the higher order, which can extend over seconds, minutes or years. All that response is instinctive, in some sense—but the faster the response, the more instinctive.
  • Things or objects are part of the objective world. They’re inanimate; spiritless. They’re dead. This is not true of chaos and order. Those are perceived, experienced and understood (to the degree that they are understood at all) as personalities—and that is just as true of the perceptions, experiences and understanding of modern people as their ancient forebears. It’s just that moderns don’t notice.
  • Perception of things as entities with personality also occurs before perception of things as things. This is particularly true of the action of others,34 living others, but we also see the non-living “objective world” as animated, with purpose and intent.
  • This is because of the operation of what psychologists have called “the hyperactive agency detector” within us.35 We evolved, over millennia, within intensely social circumstances. This means that the most significant elements of our environment of origin were personalities, not things, objects or situations.
  • The personalities we have evolved to perceive have been around, in predictable form, and in typical, hierarchical configurations, forever, for all intents and purposes. They have been…
  • the category of “parent” and/or “child” has been around for 200 million years. That’s longer than birds have existed. That’s longer than flowers have grown. It’s not a billion years, but it’s still a very long time. It’s plenty long enough for male and female and parent and child to serve as vital and fundamental parts of the environment to which we have adapted. This means that male and female and parent and child are…
  • Our brains are deeply social. Other creatures (particularly, other humans) were crucially important to us as we lived, mated and evolved. Those creatures were…
  • From a Darwinian perspective, nature—reality itself; the environment, itself—is what selects. The environment cannot be defined in any more fundamental manner. It is not mere inert matter. Reality itself is whatever we contend with when we are striving to survive and reproduce. A…
  • as our brain capacity increased and we developed curiosity to spare, we became increasingly aware of and curious about the nature of the world—what we eventually conceptualized as the objective…
  • “outside” is not merely unexplored physical territory. Outside is outside of what we currently understand—and understanding is dealing with and coping with…
  • when we first began to perceive the unknown, chaotic, non-animal world, we used categories that had originally evolved to represent the pre-human animal social world. Our minds are far older than mere…
  • Our most…
  • category—as old, in some sense, as the sexual act itself—appears to be that of sex, male and female. We appear to have taken that primordial knowledge of structured, creative opposition and…
  • Order, the known, appears symbolically associated with masculinity (as illustrated in the aforementioned yang of the Taoist yin-yang symbol). This is perhaps because the primary…
  • Chaos—the unknown—is symbolically associated with the feminine. This is partly because all the things we have come to know were born, originally, of the unknown, just as all beings we encounter were born of mothers. Chaos is mater, origin, source, mother; materia, the substance from which all things are made.
  • In its positive guise, chaos is possibility itself, the source of ideas, the mysterious realm of gestation and birth. As a negative force, it’s the impenetrable darkness of a cave and the accident by the side of the road.
  • Chaos, the eternal feminine, is also the crushing force of sexual selection.
  • Most men do not meet female human standards. It is for this reason that women on dating sites rate 85 percent of men as below average in attractiveness.40
  • Women’s proclivity to say no, more than any other force, has shaped our evolution into the creative, industrious, upright, large-brained (competitive, aggressive, domineering) creatures that we are.42 It is Nature as Woman who says, “Well, bucko, you’re good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation.”
  • Many things begin to fall into place when you begin to consciously understand the world in this manner. It’s as if the knowledge of your body and soul falls into alignment with the knowledge of your intellect.
  • And there’s more: such knowledge is prescriptive, as well as descriptive. This is the kind of knowing what that helps you know how. This is the kind of is from which you can derive an ought. The Taoist juxtaposition of yin and yang, for example, doesn’t simply portray chaos and order as the fundamental elements of Being—it also tells you how to act.
  • The Way, the Taoist path of life, is represented by (or exists on) the border between the twin serpents. The Way is the path of proper Being. It’s the same Way as that referred to by Christ in John 14:6: I am the way, and the truth and the life. The same idea is expressed in Matthew 7:14: Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.
  • We eternally inhabit order, surrounded by chaos. We eternally occupy known territory, surrounded by the unknown. We experience meaningful engagement when we mediate appropriately between them. We are adapted, in the deepest Darwinian sense, not to the world of objects, but to the meta-realities of order and chaos, yang and yin. Chaos and order make up the eternal, transcendent environment of the living.
  • To straddle that fundamental duality is to be balanced: to have one foot firmly planted in order and security, and the other in chaos, possibility, growth and adventure.
  • Chaos and order are fundamental elements because every lived situation (even every conceivable lived situation) is made up of both.
  • you need to place one foot in what you have mastered and understood and the other in what you are currently exploring and mastering. Then you have positioned yourself where the terror of existence is under control and you are secure, but where you are also alert and engaged. That is where there is something new to master and some way that you can be improved. That is where meaning is to be found.
  • The serpent in Eden therefore means the same thing as the black dot in the yin side of the Taoist yin/yang symbol of totality—that is, the possibility of the unknown and revolutionary suddenly manifesting itself where everything appears calm.
  • The outside, chaos, always sneaks into the inside, because nothing can be completely walled off from the rest of reality. So even the ultimate in safe spaces inevitably harbours a snake.
  • We have seen the enemy, after all, and he is us. The snake inhabits each of our souls.
  • The worst of all possible snakes is the eternal human proclivity for evil. The worst of all possible snakes is psychological, spiritual, personal, internal. No walls, however tall, will keep that out. Even if the fortress were thick enough, in principle, to keep everything bad whatsoever outside, it would immediately appear again within.
  • I have learned that these old stories contain nothing superfluous. Anything accidental—anything that does not serve the plot—has long been forgotten in the telling. As the Russian playwright Anton Chekhov advised, “If there is a rifle hanging on the wall in act one, it must be fired in the next act. Otherwise it has no business being there.”50
  • Eve immediately shares the fruit with Adam. That makes him self-conscious. Little has changed. Women have been making men self-conscious since the beginning of time. They do this primarily by rejecting them—but they also do it by shaming them, if men do not take responsibility. Since women bear the primary burden of reproduction, it’s no wonder. It is very hard to see how it could be otherwise. But the capacity of women to shame men and render them self-conscious is still a primal force of nature.
  • What does it mean to know yourself naked
  • Naked means vulnerable and easily damaged. Naked means subject to judgment for beauty and health. Naked means unprotected and unarmed in the jungle of nature and man. This is why Adam and Eve became ashamed, immediately after their eyes were opened. They could see—and what they first saw was themselves.
  • In their vulnerability, now fully realized, they felt unworthy to stand before God.
  • Beauty shames the ugly. Strength shames the weak. Death shames the living—and the Ideal shames us all.
  • He tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s merely descriptive.
  • women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
  • then God banishes the first man and the first woman from Paradise, out of infancy, out of the unconscious animal world, into the horrors of history itself. And then He puts cherubim and a flaming sword at the gate of Eden, just to stop them from eating the Fruit of the Tree of Life.
  • Perhaps Heaven is something you must build, and immortality something you must earn.
  • so we return to our original query: Why would someone buy prescription medication for his dog, and then so carefully administer it, when he would not do the same for himself?
  • Why should anyone take care of anything as naked, ugly, ashamed, frightened, worthless, cowardly, resentful, defensive and accusatory as a descendant of Adam? Even if that thing, that being, is himself?
  • We know how we are naked, and how that nakedness can be exploited—and that means we know how others are naked, and how they can be exploited. We can terrify other people, consciously. We can hurt and humiliate them for faults we understand only too well. We can torture them—literally—slowly, artfully and terribly. That’s far more than predation. That’s a qualitative shift in understanding. That’s a cataclysm as large as the development of self-consciousness itself. That’s the entry of the knowledge of Good and Evil into the world.
  • Only man could conceive of the rack, the iron maiden and the thumbscrew. Only man will inflict suffering for the sake of suffering. That is the best definition of evil I have been able to formulate.
  • with this realization we have well-nigh full legitimization of the idea, very unpopular in modern intellectual circles, of Original Sin.
  • Human beings have a great capacity for wrongdoing. It’s an attribute that is unique in the world of life. We can and do make things worse, voluntarily, with full knowledge of what we are doing (as well as accidentally, and carelessly, and in a manner that is willfully blind). Given that terrible capacity, that proclivity for malevolent actions, is it any wonder we have a hard time taking care of ourselves, or others—or even that we doubt the value of the entire human enterprise?
  • The juxtaposition of Genesis 1 with Genesis 2 & 3 (the latter two chapters outlining the fall of man, describing why our lot is so tragedy-ridden and ethically torturous) produces a narrative sequence almost unbearable in its profundity. The moral of Genesis 1 is that Being brought into existence through true speech is Good.
  • The original Man and Woman, existing in unbroken unity with their Creator, did not appear conscious (and certainly not self-conscious). Their eyes were not open. But, in their perfection, they were also less, not more, than their post-Fall counterparts. Their goodness was something bestowed, rather than deserved or earned.
  • Maybe, even in some cosmic sense (assuming that consciousness itself is a phenomenon of cosmic significance), free choice matters.
  • here’s a proposition: perhaps it is not simply the emergence of self-consciousness and the rise of our moral knowledge of Death and the Fall that besets us and makes us doubt our own worth. Perhaps it is instead our unwillingness—reflected in Adam’s shamed hiding—to walk with God, despite our fragility and propensity for evil.
  • The entire Bible is structured so that everything after the Fall—the history of Israel, the prophets, the coming of Christ—is presented as a remedy for that Fall, a way out of evil. The beginning of conscious history, the rise of the state and all its pathologies of pride and rigidity, the emergence of great moral figures who try to set things right, culminating in the Messiah Himself—that is all part of humanity’s attempt, God willing, to set itself right. And what would that mean?
  • And this is an amazing thing: the answer is already implicit in Genesis 1: to embody the Image of God—to speak out of chaos the Being that is Good—but to do so consciously, of our own free choice.
  • Back is the way forward—as T. S. Eliot so rightly insisted
  • We shall not cease from exploration And the end of all our exploring Will be to arrive where we started And know the place for the first time.
  • If we wish to take care of ourselves properly, we would have to respect ourselves—but we don’t, because we are—not least in our own eyes—fallen creatures.
  • If we lived in Truth; if we spoke the Truth—then we could walk with God once again, and respect ourselves, and others, and the world. Then we might treat ourselves like people we cared for.
  • We might strive to set the world straight. We might orient it toward Heaven, where we would want people we cared for to dwell, instead of Hell, where our resentment and hatred would eternally sentence everyone.
  • Then, the primary moral issue confronting society was control of violent, impulsive selfishness and the mindless greed and brutality that accompanies it.
  • It is easy to believe that people are arrogant, and egotistical, and always looking out for themselves. The cynicism that makes that opinion a universal truism is widespread and fashionable.
  • But such an orientation to the world is not at all characteristic of many people. They have the opposite problem: they shoulder intolerable burdens of self-disgust, self-contempt, shame and self-consciousness. Thus, instead of narcissistically inflating their own importance, they don’t value themselves at all, and they don’t take care of themselves with attention and skill.
  • Christ’s archetypal death exists as an example of how to accept finitude, betrayal and tyranny heroically—how to walk with God despite the tragedy of self-conscious knowledge—and not as a directive to victimize ourselves in the service of others.
  • To sacrifice ourselves to God (to the highest good, if you like) does not mean to suffer silently and willingly when some person or organization demands more from us, consistently, than is offered in return. That means we are supporting tyranny, and allowing ourselves to be treated like slaves.
  • I learned two very important lessons from Carl Jung, the famous Swiss depth psychologist, about “doing unto others as you would have them do unto you” or “loving your neighbour as yourself.”
  • The first lesson was that neither of these statements has anything to do with being nice. The second was that both are equations, rather than injunctions.
  • If I am someone’s friend, family member, or lover, then I am morally obliged to bargain as hard on my own behalf as they are on theirs.
  • there is little difference between standing up and speaking for yourself, when you are being bullied or otherwise tormented and enslaved, and standing up and speaking for someone else.
  • you do not simply belong to yourself. You are not simply your own possession to torture and mistreat. This is partly because your Being is inexorably tied up with that of others, and your mistreatment of yourself can have catastrophic consequences for others.
  • metaphorically speaking, there is also this: you have a spark of the divine in you, which belongs not to you, but to God. We are, after all—according to Genesis—made in His image.
  • We can make order from chaos—and vice versa—in our way, with our words. So, we may not exactly be God, but we’re not exactly nothing, either.
  • In my own periods of darkness, in the underworld of the soul, I find myself frequently overcome and amazed by the ability of people to befriend each other, to love their intimate partners and parents and children, and to do what they must do to keep the machinery of the world running.
  • It is this sympathy that should be the proper medicament for self-conscious self-contempt, which has its justification, but is only half the full and proper story. Hatred for self and mankind must be balanced with gratefulness for tradition and the state and astonishment at what normal, everyday people accomplish
  • You have some vital role to play in the unfolding destiny of the world. You are, therefore, morally obliged to take care of yourself.
  • To treat yourself as if you were someone you are responsible for helping is, instead, to consider what would be truly good for you. This is not “what you want.” It is also not “what would make you happy.”
  • You must help a child become a virtuous, responsible, awake being, capable of full reciprocity—able to take care of himself and others, and to thrive while doing so. Why would you think it acceptable to do anything less for yourself?
  • You need to know who you are, so that you understand your armament and bolster yourself in respect to your limitations. You need to know where you are going, so that you can limit the extent of chaos in your life, restructure order, and bring the divine force of Hope to bear on the world.
  • You need to determine how to act toward yourself so that you are most likely to become and to stay a good person.
  • Don’t underestimate the power of vision and direction. These are irresistible forces, able to transform what might appear to be unconquerable obstacles into traversable pathways and expanding opportunities.
  • Once having understood Hell, researched it, so to speak—particularly your own individual Hell—you could decide against going there or creating that.
  • You could, in fact, devote your life to this. That would give you a Meaning, with a capital M. That would justify your miserable existence.
  • That would atone for your sinful nature, and replace your shame and self-consciousness with the natural pride and forthright confidence of someone who has learned once again to walk with God in the Garden.
  • RULE 3   MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU
  • It would be more romantic, I suppose, to suggest that we would have all jumped at the chance for something more productive, bored out of our skulls as we were. But it’s not true. We were all too prematurely cynical and world-weary and leery of responsibility to stick to the debating clubs and Air Cadets and school sports that the adults around us tried to organize. Doing anything wasn’t cool.
  • When you move, everything is up in the air, at least for a while. It’s stressful, but in the chaos there are new possibilities. People, including you, can’t hem you in with their old notions. You get shaken out of your ruts. You can make new, better ruts, with people aiming at better things. I thought this was just a natural development. I thought that every person who moved would have—and want—the same phoenix-like experience.
  • What was it that made Chris and Carl and Ed unable (or, worse, perhaps, unwilling) to move or to change their friendships and improve the circumstances of their lives? Was it inevitable—a consequence of their own limitations, nascent illnesses and traumas of the past?
  • Why did he—like his cousin, like my other friends—continually choose people who, and places that, were not good for him?
  • perhaps, they don’t want the trouble of better. Freud called this a “repetition compulsion.” He thought of it as an unconscious drive to repeat the horrors of the past
  • People create their worlds with the tools they have directly at hand. Faulty tools produce faulty results. Repeated use of the same faulty tools produces the same faulty results.
  • It is in this manner that those who fail to learn from the past doom themselves to repeat it. It’s partly fate. It’s partly inability. It’s partly…unwillingness to learn? Refusal to learn? Motivated refusal to learn?
  • People choose friends who aren’t good for them for other reasons, too. Sometimes it’s because they want to rescue someone.
  • it is not easy to distinguish between someone truly wanting and needing help and someone who is merely exploiting a willing helper. The distinction is difficult even for the person who is wanting and needing and possibly exploiting.
  • When it’s not just naïveté, the attempt to rescue someone is often fuelled by vanity and narcissism.
  • But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you.
  • How do you know that your attempts to pull someone up won’t instead bring them—or you—further down?
  • The same thing happens when well-meaning counsellors place a delinquent teen among comparatively civilized peers. The delinquency spreads, not the stability. Down is a lot easier than up.
  • maybe you’re saving someone because you want to convince yourself that the strength of your character is more than just a side effect of your luck and birthplace. Or maybe it’s because it’s easier to look virtuous when standing alongside someone utterly irresponsible.
  • Or maybe you have no plan, genuine or otherwise, to rescue anybody. You’re associating with people who are bad for you not because it’s better for anyone, but because it’s easier.
  • You know it. Your friends know it. You’re all bound by an implicit contract—one aimed at nihilism, and failure, and suffering of the stupidest sort.
  • Before you help someone, you should find out why that person is in trouble. You shouldn’t merely assume that he or she is a noble victim of unjust circumstances and exploitation. It’s the most unlikely explanation, not the most probable.
  • Besides, if you buy the story that everything terrible just happened on its own, with no personal responsibility on the part of the victim, you deny that person all agency in the past (and, by implication, in the present and future, as well).
  • It is far more likely that a given individual has just decided to reject the path upward, because of its difficulty. Perhaps that should even be your default assumption, when faced with such a situation.
  • failure is easy to understand. No explanation for its existence is required. In the same manner, fear, hatred, addiction, promiscuity, betrayal and deception require no explanation. It’s not the existence of vice, or the indulgence in it, that requires explanation. Vice is easy.
  • Failure is easy, too. It’s easier not to shoulder a burden. It’s easier not to think, and not to do, and not to care. It’s easier to put off until tomorrow what needs to be done today,
  • Success: that’s the mystery. Virtue: that’s what’s inexplicable. To fail, you merely have to cultivate a few bad habits. You just have to bide your time. And once someone has spent enough time cultivating bad habits and biding their time, they are much diminished.
  • I am not saying that there is no hope of redemption. But it is much harder to extract someone from a chasm than to lift him from a ditch. And some chasms are very deep. And there’s not much left of the body at the bottom.
  • Carl Rogers, the famous humanistic psychologist, believed it was impossible to start a therapeutic relationship if the person seeking help did not want to improve. Rogers believed it was impossible to convince someone to change for the better.
  • none of this is a justification for abandoning those in real need to pursue your narrow, blind ambition, in case it has to be said.
  • Here’s something to consider: If you have a friend whose friendship you wouldn’t recommend to your sister, or your father, or your son, why would you have such a friend for yourself?
  • You are not morally obliged to support someone who is making the world a worse place. Quite the opposite. You should choose people who want things to be better, not worse. It’s a good thing, not a selfish thing, to choose people who are good for you.
  • It is for this reason that every good example is a fateful challenge, and every hero, a judge. Michelangelo’s great perfect marble David cries out to its observer: “You could be more than you are.”
  • Don’t think that it is easier to surround yourself with good healthy people than with bad unhealthy people. It’s not. A good, healthy person is an ideal. It requires strength and daring to stand up near such a person.
  • RULE 4   COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY
  • IT WAS EASIER FOR PEOPLE to be good at something when more of us lived in small, rural communities. Someone could be homecoming queen. Someone else could be spelling-bee champ, math whiz or basketball star. There were only one or two mechanics and a couple of teachers. In each of their domains, these local heroes had the opportunity to enjoy the serotonin-fuelled confidence of the victor.
  • Our hierarchies of accomplishment are now dizzyingly vertical.
  • No matter how good you are at something, or how you rank your accomplishments, there is someone out there who makes you look incompetent.
  • We are not equal in ability or outcome, and never will be. A very small number of people produce very much of everything.
  • People are unhappy at the bottom. They get sick there, and remain unknown and unloved. They waste their lives there. They die there. In consequence, the self-denigrating voice in the minds of people weaves a devastating tale. Life is a zero-sum game. Worthlessness is the default condition.
  • It is for such reasons that a whole generation of social psychologists recommended “positive illusions” as the only reliable route to mental health. Their credo? Let a lie be your umbrella. A more dismal, wretched, pessimistic philosophy can hardly be imagined.
  • Here is an alternative approach (and one that requires no illusions). If the cards are always stacked against you, perhaps the game you are playing is somehow rigged (perhaps by you, unbeknownst to yourself). If the internal voice makes you doubt the value of your endeavours—or your life, or life itself—perhaps you should stop listening.
  • There will always be people better than you—that’s a cliché of nihilism, like the phrase, In a million years, who’s going to know the difference? The proper response to that statement is not, Well, then, everything is meaningless. It’s, Any idiot can choose a frame of time within which nothing matters.
  • Standards of better or worse are not illusory or unnecessary. If you hadn’t decided that what you are doing right now was better than the alternatives, you wouldn’t be doing it. The idea of a value-free choice is a contradiction in terms. Value judgments are a precondition for action.
  • Furthermore, every activity, once chosen, comes with its own internal standards of accomplishment. If something can be done at all, it can be done better or worse. To do anything at all is therefore to play a game with a defined and valued end, which can always be reached more or less efficiently and elegantly.
  • We might start by considering the all-too-black-and-white words themselves: “success” or “failure.” You are either a success, a comprehensive, singular, over-all good thing, or its opposite, a failure, a comprehensive, singular, irredeemably bad thing.
  • There are vital degrees and gradations of value obliterated by this binary system, and the consequences are not good.
  • there is not just one game at which to succeed or fail. There are many games and, more specifically, many good games—
  • if changing games does not work, you can invent a new one.
  • You might consider judging your success across all the games you play.
  • When we are very young we are neither individual nor informed. We have not had the time nor gained the wisdom to develop our own standards. In consequence, we must compare ourselves to others, because standards are necessary.
  • As we mature we become, by contrast, increasingly individual and unique. The conditions of our lives become more and more personal and less and less comparable with those of others. Symbolically speaking, this means we must leave the house ruled by our father, and confront the chaos of our individual Being.
  • We must then rediscover the values of our culture—veiled from us by our ignorance, hidden in the dusty treasure-trove of the past—rescue them, and integrate them into our own lives. This is what gives existence its full and necessary meaning.
  • What is it that you actually love? What is it that you genuinely want? Before you can articulate your own standards of value, you must see yourself as a stranger—and then you must get to know yourself.
  • Dare to be truthful. Dare to articulate yourself, and express (or at least become aware of) what would really justify your life.
  • Consult your resentment. It’s a revelatory emotion, for all its pathology. It’s part of an evil triad: arrogance, deceit, and resentment. Nothing causes more harm than this underworld Trinity. But resentment always means one of two things. Either the resentful person is immature, in which case he or she should shut up, quit whining, and get on with it, or there is tyranny afoot—in which case the person subjugated has a moral obligation to speak up.
  • Be cautious when you’re comparing yourself to others. You’re a singular being, once you’re an adult. You have your own particular, specific problems—financial, intimate, psychological, and otherwise.
  • Those are embedded in the unique broader context of your existence. Your career or job works for you in a personal manner, or it does not, and it does so in a unique interplay with the other specifics of your life.
  • We must see, but to see, we must aim, so we are always aiming. Our minds are built on the hunting-and-gathering platforms of our bodies. To hunt is to specify a target, track it, and throw at it.
  • We live within a framework that defines the present as eternally lacking and the future as eternally better. If we did not see things this way, we would not act at all. We wouldn’t even be able to see, because to see we must focus, and to focus we must pick one thing above all else on which to focus.
  • The disadvantage to all this foresight and creativity is chronic unease and discomfort. Because we always contrast what is with what could be, we have to aim at what could be.
  • The present is eternally flawed. But where you start might not be as important as the direction you are heading. Perhaps happiness is always to be found in the journey uphill, and not in the fleeting sense of satisfaction awaiting at the next peak.
  • Called upon properly, the internal critic will suggest something to set in order, which you could set in order, which you would set in order—voluntarily, without resentment, even with pleasure.
  • “Excuse me,” you might say to yourself, without irony or sarcasm. “I’m trying to reduce some of the unnecessary suffering around here. I could use some help.” Keep the derision at bay. “I’m wondering if there is anything that you would be willing to do? I’d be very grateful for your service.” Ask honestly and with humility. That’s no simple matter.

What's in the $1.9 trillion rescue plan for small businesses - CNN - 0 views

  • The $1.9 trillion American Rescue Plan Act that President Joe Biden will sign into law on Friday contains a number of provisions intended to help small businesses, especially restaurants.
  • It authorizes another $7.25 billion for the Paycheck Protection Program, which offers forgivable loans to small businesses and other organizations hurt by the pandemic. The loans will only be forgiven, however, if at least 60% of the money is used to support payroll expenses and the remainder goes to mortgage interest, rent, utilities, personal protective equipment or certain other business expenses.
  • Despite the added funds, however, the legislation doesn't actually extend the program, which is currently set to expire on March 31.
  • That means banks must decide how much longer they want to accept new applications, since it now takes 24 to 48 hours for a bank to hear back from the SBA on whether a submitted loan application has been approved.
  • Bank of America, for instance, stopped accepting new applications on Tuesday. "As the largest lender in the program since it began, we have 30,000 applications in process and want to allow enough time to complete the work and get each client's application through the SBA process by March 31," a bank spokesman said in an email to CNN Business.
  • The American Rescue Plan includes nearly $29 billion to create a grant program that provides direct relief to restaurants.
  • Another $15 billion will be added to the Shuttered Venue Operators Grants program, created by the previous economic aid package. The grants are intended to help those who run museum, theater, concert and other venues that had to shut down due to Covid restrictions. The bill also allows such operators to apply for PPP loans in addition to these grants.
  • To help the SBA administer all the new programs that have fallen under its purview as a result of the pandemic, the bill allocates another $1.325 billion to its budget.

Trump's Pardons: The List - The New York Times - 0 views

  • With hours to go before President Trump left office, the White House released a list early Wednesday of 73 people he had pardoned and 70 others whose sentences he had commuted.
  • On the list were at least two people who had worked for Mr. Trump: Stephen K. Bannon, his former chief strategist, and Elliott Broidy, a former top fund-raiser. Both received full pardons.
  • The rapper Lil Wayne, born Dwayne Michael Carter Jr., received a full pardon after pleading guilty to possession of a firearm and ammunition by a felon in December. Mr. Trump also granted a commutation to another rapper, Kodak Black, whose legal name is Bill Kapri (though he was born Dieuson Octave). In 2019, he was sentenced to nearly four years in prison for lying on background paperwork while attempting to buy guns.
  • Mr. Trump issued full pardons to Nicholas Slatten and three other former U.S. service members who were convicted on charges related to the killing of Iraqi civilians while they were working as security contractors for Blackwater, a private company, in 2007.
  • Mr. Manafort, 71, had been sentenced in 2019 to seven and a half years in prison for his role in a decade-long, multimillion-dollar financial fraud scheme for his work in the former Soviet Union. He was released early from prison in May as a result of the coronavirus pandemic and given home confinement. Mr. Trump had repeatedly expressed sympathy for Mr. Manafort, describing him as a brave man who had been mistreated by the special counsel’s office.
  • Mr. Stone, a longtime friend and adviser of Mr. Trump, was sentenced in February 2020 to more than three years in prison in a politically fraught case that put the president at odds with his attorney general. Mr. Stone was convicted of seven felony charges, including lying under oath to a congressional committee and threatening a witness whose testimony would have exposed those lies.
  • Mr. Kushner, 66, the father-in-law of the president’s older daughter, Ivanka Trump, pleaded guilty in 2004 to 16 counts of tax evasion, a single count of retaliating against a federal witness and one of lying to the Federal Election Commission. He served two years in prison before being released in 2006.
  • Michael T. Flynn, a former national security adviser who twice pleaded guilty to lying to the F.B.I. about his conversations with a Russian diplomat, and whose prosecution Attorney General William P. Barr tried to shut down, was the only White House official to be convicted as part of the Trump-Russia investigation.
  • The Supreme Court has ruled that the Constitution gives presidents unlimited authority to grant pardons, which excuse or forgive a federal crime. A commutation, by contrast, makes a punishment milder without wiping out the underlying conviction.
  • Joe Arpaio, an anti-immigration crusader who enjoyed calling himself “America’s toughest sheriff,” was the first pardon of Mr. Trump’s presidency.
  • Conrad M. Black, a former press baron and friend of Mr. Trump’s, was granted a full pardon 12 years after his sentencing for fraud and obstruction of justice.
  • Former Gov. Rod R. Blagojevich of Illinois was sentenced in 2011 to 14 years in prison for trying to sell or trade to the highest bidder the Senate seat that Mr. Obama vacated after he was elected president.
  • Susan B. Anthony, the women’s suffragist, was arrested in Rochester, N.Y., in 1872 for voting illegally and was fined $100. Mr. Trump pardoned her on Aug. 18, the 100th anniversary of the ratification of the 19th Amendment, which extended voting rights to women.
  • Edward J. DeBartolo Jr., a former owner of the San Francisco 49ers, pleaded guilty in 1998 to concealing an extortion plot. Mr. DeBartolo was prosecuted after he gave Edwin W. Edwards, the influential former governor of Louisiana, $400,000 to secure a riverboat gambling license for his gambling consortium.
  • Alice Marie Johnson was serving life in a federal prison for a nonviolent drug conviction before her case was brought to Mr. Trump’s attention by the reality television star Kim Kardashian West.
  • Jack Johnson, the first Black heavyweight boxing champion, was tarnished by a racially tainted criminal conviction in 1913 — for transporting a white woman across state lines — that haunted him well after his death in 1946. Mr. Trump pardoned him on May 24, 2018.
  • Dinesh D’Souza received a presidential pardon after pleading guilty to making illegal campaign contributions in 2014. Mr. D’Souza, a filmmaker and author whose subjects often dabble in conspiracy theories, had long blamed his conviction on his political opposition to Mr. Obama.
  • Zay Jeffries, a metal scientist whose contributions to the Manhattan Project and whose development of armor-piercing artillery shells helped the Allies win World War II, was granted a posthumous pardon on Oct. 10, 2019. Jeffries was found guilty in 1948 of an antitrust violation related to his work and was fined $2,500.
  • Ten years ago, Bernard B. Kerik, a former New York City police commissioner, was sentenced to four years in prison after pleading guilty to eight felony charges, including tax fraud and lying to White House officials.
  • I. Lewis Libby Jr., known as Scooter, was Vice President Dick Cheney’s top adviser before Mr. Libby was convicted in 2007 of four felony counts, including perjury and obstruction of justice, in connection with the disclosure of the identity of a C.I.A. officer, Valerie Plame.
  • Mr. Trump’s decision to clear three members of the armed services who had been accused or convicted of war crimes signaled that the president intended to use his power as the ultimate arbiter of military justice.
  • Michael R. Milken was the billionaire “junk bond king” and a well-known financier on Wall Street in the 1980s. In 1990, he pleaded guilty to securities fraud and conspiracy charges and was sentenced to 10 years in prison, though his sentence was later reduced to two. He also agreed to pay $600 million in fines and penalties.
  • Dwight Hammond and his son, Steven Hammond, were Oregon cattle ranchers who had been serving five-year sentences for arson on federal land. Their cases inspired an antigovernment group’s weekslong standoff at the Malheur National Wildlife Refuge in Oregon in 2016 and brought widespread attention to anger over federal land management in the Western United States.
  • David H. Safavian, the top federal procurement official under President George W. Bush, was sentenced in 2009 to a year in prison for covering up his ties to Jack Abramoff, the disgraced lobbyist whose corruption became a symbol of the excesses of Washington influence peddling. Mr. Safavian was convicted of obstruction of justice and making false statements.
  • Angela Stanton — an author, television personality and motivational speaker — served six months of home confinement in 2007 for her role in a stolen-vehicle ring. Her book “Life of a Real Housewife” explores her difficult upbringing and her encounters with reality TV stars.

The nation's public health agencies are ailing when they're needed most - The Washingto... - 0 views

  • At the very moment the United States needed its public health infrastructure the most, many local health departments had all but crumbled, proving ill-equipped to carry out basic functions let alone serve as the last line of defense against the most acute threat to the nation’s health in generations.
  • Epidemiologists, academics and local health officials across the country say the nation’s public health system is one of many weaknesses that continue to leave the United States poorly prepared to handle the coronavirus pandemic
  • That system lacks financial resources. It is losing staff by the day.
  • Even before the pandemic struck, local public health agencies had lost almost a quarter of their overall workforce since 2008 — a reduction of almost 60,000 workers
  • The agencies’ main source of federal funding — the Centers for Disease Control and Prevention’s emergency preparedness budget — had been cut 30 percent since 2003. The Trump administration had proposed slicing even deeper.
  • According to David Himmelstein of the CUNY School of Public Health, global consensus is that, at minimum, 6 percent of a nation’s health spending should be devoted to public health efforts. The United States, he said, has never spent more than half that much.
  • the problems have been left to fester.
  • Delaware County, Pa., a heavily populated Philadelphia suburb, did not even have a public health department when the pandemic struck and had to rely on a neighbor to mount a response.
  • With plunging tax receipts straining local government budgets, public health agencies confront the possibility of further cuts in an economy gutted by the coronavirus. It is happening at a time when health departments are being asked to do more than ever.
  • While the country spends roughly $3.6 trillion every year on health, less than 3 percent of that spending goes to public health and prevention
  • “That’s the way we run much of our public health activity for local health departments. You apply to the CDC, which is the major conduit for federal funding to state and local health departments,” Himmelstein said. “You apply to them for funding for particular functions, and if you don’t get the grant, you don’t have the funding for that.”
  • Compared with Canada, the United Kingdom and northern European countries, the United States — with a less generous social safety net and no universal health care — is investing less in a system that its people rely on more.
  • Himmelstein said that the United States has never placed much emphasis on public health spending but that the investment began to decline even further in the early 2000s. The Great Recession fueled further cuts.
  • Plus, the U.S. public health system relies heavily on federal grants.
  • “Why an ongoing government function should depend on episodic grants rather than consistent funding, I don’t know,” he added. “That would be like seeing that the military is going to apply for a grant for its regular ongoing activities.”
  • Many public health officials say a lack of a national message and approach to the pandemic has undermined their credibility and opened them up to criticism.
  • Few places were less prepared for covid-19’s arrival than Delaware County, Pa., where Republican leaders had decided they did not need a public health department at all
  • “I think the general population didn’t really realize we didn’t have a health department. They just kind of assumed that was one of those government agencies we had,” Taylor said. “Then the pandemic hit, and everyone was like, ‘Wait, hold on — we don’t have a health department? Why don’t we have a health department?’ ”
  • Taylor and other elected officials worked out a deal with neighboring Chester County in which Delaware County paid affluent Chester County’s health department to handle coronavirus operations for both counties for now.
  • One reason health departments are so often neglected is their work focuses on prevention — of outbreaks, sexually transmitted diseases, smoking-related illnesses. Local health departments describe a frustrating cycle: The more successful they are, the less visible problems are and the less funding they receive. Often, that sets the stage for problems to explode again — as infectious diseases often do.
  • It has taken years for many agencies to rebuild budgets and staffing from deep cuts made during the last recession.
  • During the past decade, many local health departments have seen annual rounds of cuts, punctuated with one-time infusions of money following crises such as outbreaks of Zika, Ebola, measles and hepatitis. The problem with that cycle of feast or famine funding is that the short-term money quickly dries up and does nothing to address long-term preparedness.
  • “It’s a silly strategic approach when you think about what’s needed to protect us long term,”
  • She compared the country’s public health system to a house with deep cracks in the foundation. The emergency surges of funding are superficial repairs that leave those cracks unaddressed.
  • “We came into this pandemic at a severe deficit and are still without a strategic goal to build back that infrastructure. We need to learn from our mistakes,”
  • With the economy tanking, the tax bases for cities and counties have shrunken dramatically — payroll taxes, sales taxes, city taxes. Many departments have started cutting staff. Federal grants are no sure thing.
  • 80 percent of counties have reported their budget was affected in the current fiscal year because of the crisis. Prospects are even more dire for future budget periods, when the full impact of reduced tax revenue will become evident.
  • Christine Hahn, medical director for Idaho’s division of public health and a 25-year public health veteran, has seen the state make progress in coronavirus testing and awareness. But like so many public health officials across the country taking local steps to deal with what has become a national problem, she is limited by how much government leaders say she can do and by what citizens are willing to do.
  • “I’ve been through SARS, the 2009 pandemic, the anthrax attacks, and of course I’m in rural Idaho, not New York City and California,” Hahn said. “But I will say this is way beyond anything I’ve ever experienced as far as stress, workload, complexity, frustration, media and public interest, individual citizens really feeling very strongly about what we’re doing and not doing.”
  • At the same time, many countries that invest more in public health infrastructure also provide universal medical coverage that enables them to provide many common public health services as part of their main health-care-delivery system.
  • “People locally are looking to see what’s happening in other states, and we’re constantly having to talk about that and address that,”
  • “I’m mindful of the credibility of our messaging as people say, ‘What about what they’re doing in this place? Why are we not doing what they’re doing?’ ”
  • Many health experts worry the challenges will multiply in the fall with the arrival of flu season.
  • “The unfolding tragedy here is we need people to see local public health officials as heroes in the same way that we laud heart surgeons and emergency room doctors,” Westergaard, the Wisconsin epidemiologist, said. “The work keeps getting higher, and they’re falling behind — and not feeling appreciated by their communities.”

Undocumented Venezuelans Given Protected Status In United States : NPR - 0 views

  • The Biden administration said Monday that it will allow many Venezuelans who are already in the country illegally to remain because of the humanitarian and economic crisis in the socialist South American nation that is an adversary of the U.S.
  • Homeland Security Secretary Alejandro Mayorkas granted Temporary Protected Status to an estimated 320,000 Venezuelans.
  • Venezuela has fallen into a life-threatening economic and humanitarian crisis that has caused more than 5 million Venezuelans to flee in search of food, medicine and shelter.
  • The decision represents a dramatic shift in policy from the previous administration, which withstood bipartisan calls to grant the protections to Venezuelans.
  • Fernand Amandi, a Miami-based Democratic political strategist and pollster, said granting TPS to Venezuelans could open up a new relationship for Biden with an electorate that largely supported Trump in 2020.
  • Temporary Protected Status is granted to those from countries ravaged by natural disasters or war, allowing them to live and work in the United States until conditions improve back home.
  • "Venezuela is a failed state in every single measure," he said. "There's simply not enough food to go around. Hospitals had collapsed before the pandemic. You have a repressive dictatorship and extraordinary levels of violence in the street. You have pro-government militias attacking protesters. It's just inhumane to send people back into those conditions."

AOC and Rashida Tlaib's Public Banking Act, explained - Vox - 0 views

  • A public option, but for banking. That’s what Reps. Rashida Tlaib and Alexandria Ocasio-Cortez are proposing in a new bill unveiled on Friday.
  • would foster the creation of public banks across the country by providing them a pathway to getting started, establishing an infrastructure for liquidity and credit facilities for them via the Federal Reserve, and setting up federal guidelines for them to be regulated.
  • at some point it’s just hitting a wall where it doesn’t carry them along and they’re looking for options,” said Tlaib, who represents Michigan’s 13th Congressional District, the third-poorest congressional district in the country. “So I’m putting this on the table as an option.”
  • The proposal lands in the midst of the Covid-19 pandemic, which has shed light on many inefficiencies in the American system, including banking. Take the Paycheck Protection Program, for example: It used the regular banking system as an intermediary, which ultimately meant that bigger businesses and those with preexisting relationships with those banks were prioritized over others.
  • guarantee a more equitable recovery by providing an alternative to Wall Street banks for state and local governments, businesses, and ordinary people,
  • The public banking bill also does double duty as a climate bill: It would prohibit public banks from investing in or doing business with the fossil fuel industry.
  • “Public banks empower states and municipalities to establish new channels of public investment to help solve systemic crises.”
  • But, he said, this proposal is particularly comprehensive and supportive.
  • If Democrats keep control of the House come 2021 and manage to flip the Senate and win the White House, they’ll be able to take some big legislative swings, including and perhaps especially on issues related to the economy.
  • which theoretically would be more motivated to do public good and invest in their communities than private institutions, which are out for profit.
  • To be clear, the Public Banking Act isn’t creating a federal public bank.
  • encourage and enable the creation of public banks across the US. It provides legitimacy to those who are pushing for more public banking, and it also includes regulators as key stakeholders who can support and provide guidance for how those banks should operate.
  • though different public banks would likely have different areas of emphasis.
  • They could also facilitate easier access to funds for state and local governments from the federal government or Federal Reserve.
  • “It’s basically a way to finance state and local investment that doesn’t go through Wall Street and doesn’t leave the community and turn into a windfall for shareholders,
  • Public banks need the FDIC to provide assurances that it will recognize them in accordance with the bond rating of the city or state they represent.
  • Tlaib recalled hearing from her constituents when the $1,200 coronavirus stimulus checks went out this spring — people waiting days and weeks for direct deposits, or getting a check in the mail only to lose a substantial portion of it cashing it at the store down the street.
  • The Public Banking Act allows the Federal Reserve to charter and grant membership to public banks and creates a grant program for the Treasury secretary to provide seed money for public banks to be formed, capitalized, and developed.
  • “This is more about community development.”
  • McConnell said the FDIC issuing guidance that it recognizes the city’s — and the state’s — public banks as an AAA rating would send a clear direction to the state financial regulators that the public bank is considered low risk.
  • The bill would also provide a road map for the FDIC, which insures bank deposits of up to $250,000, to insure deposits for public banks, so people feel assured they won’t lose all their money by choosing to open an account with their state bank instead of, say, Wells Fargo.
  • the Office of the Comptroller of the Currency (OCC) has historically been charged with chartering national banks in the US, not the Fed, meaning this is a fairly novel idea.
  • It prohibits the Fed and Treasury from considering the financial health of an entity that controls or owns a bank in grant-making decisions.
  • So here is the thing about private companies, including, yes, banks: The point of them is to make money, and that drives their decisions. It’s not necessarily evil (though sometimes it kind of is), but it’s just how they work.
  • The idea behind public banking isn’t that Goldman Sachs, Wells Fargo, and Morgan Stanley go away; it’s that they have to compete with a government-owned entity — and one that’s a little fairer and more ethical in how it does business.
  • Public banks, as imagined in the Tlaib/Ocasio-Cortez proposal, would provide loans to small businesses and governments with lower interest rates and lower fees.
  • Student loans are facilitated directly with BND (the Bank of North Dakota), but other loans, called participation loans, go through a local financial institution — often with BND support.
  • According to a study on public banks, BND had some $2 billion in active participation loans in 2014. BND can grant larger loans at a lower risk, which fosters a healthy financial ecosystem populated by a cluster of small North Dakota banks.
  • Democrats have a lot of ideas, and if they take power come January 2021, there’s a lot they can do.
  • The Public Banking Act is meant to complement ideas such as the ABC Act and postal banking. And, of course, it’s linked to the Green New Deal, not only because it would bar public banks from financing things that hurt the environment, but also because the idea is that public banks would play a major role in financing Green New Deal and climate-friendly projects.
  • If former Vice President Joe Biden wins the White House and Democrats control both the House and the Senate come 2021, the talk around these ideas becomes a lot more serious.

Your Home Belongs to Renovation TV - The Atlantic - 0 views

  • HGTV is regularly a top-five cable channel—and its growing popularity has coincided with a huge increase in actual renovations. In the 1990s, American homeowners spent an average of more than $90 billion annually on remodeling their homes. By 2020, it was more than $400 billion
  • For homeowners, pressure to keep up with the Joneses has reached a logical extreme. Everywhere you look, there are new reasons to be unhappy with your house, and new trends you can follow to fix it.
  • Annetta Grant, a professor at Bucknell University who studies the home-renovation market, recently co-authored an ethnography on how home-reno media has changed people’s relationship to their home. She and her fellow researcher, Jay Handelman, conducted extensive interviews with 17 people in the process of renovating their home, attended a consumer-renovation expo, interviewed renovation-service providers, and consumed dozens of hours and hundreds of pages of home-reno media.
  • The primary finding was that home-renovation media seems to make people feel uneasy in their own home. In academic terms, the phenomenon is known as dysplacement, or a sense that our long-held understanding of what our home means to us is out of sync with what changing market forces have decided a home should be. In layman’s terms, it’s the unsettling feeling that the home you’ve made for yourself is no longer a good one, and that other people think less of you for it.
  • People are highly sensitive to feeling out-of-sorts in their home, Grant told me. This is one of the reasons that moving and unpacking are so stressful, and that accumulating unnecessary clutter feels so bothersome.
  • Americans have long understood successful home ownership and homemaking as indicative of personal success and character. Beginning in the postwar era, “that was largely achieved by customizing your home to the personality that you wanted to portray,”
  • Even in the tract-home developments of mid-century suburbs, the insides of houses tended to be idiosyncratic, with liberal use of color and texture and pattern—on the walls, the floors, the furniture. Some of those choices were the result of trends, of course, but there was plenty of variety within those parameters, and people tended to pick things they liked and stick with them
  • Now, however, “personalization is being ripped out of people’s homes” in favor of market-pleasing standardization,
  • Grant said that people expressed embarrassment at having friends over to their outdated home, so much so that they’d avoid hosting their book club or planning parties—precisely the kinds of happy occasions that your home is supposed to be for.
  • The goal of this media apparatus, Grant said, isn’t to provide knowledge and inspiration for people improving the country’s aging housing stock but to keep people engaged in a process of constant updating—discarding old furniture and fixtures and appliances and buying new ones in much the way many people now cycle through an endless stream of fast-fashion pieces, trying to live up to standards that they can never quite pin down, and therefore never quite satisfy

Opinion | The Pandemic Probably Started in a Lab. These 5 Key Points Explain Why. - The... - 0 views

  • a growing volume of evidence — gleaned from public records released under the Freedom of Information Act, digital sleuthing through online databases, scientific papers analyzing the virus and its spread, and leaks from within the U.S. government — suggests that the pandemic most likely occurred because a virus escaped from a research lab in Wuhan, China.
  • If so, it would be the most costly accident in the history of science.
  • The SARS-like virus that caused the pandemic emerged in Wuhan, the city where the world’s foremost research lab for SARS-like viruses is located.
  • Dr. Shi’s group was fascinated by how coronaviruses jump from species to species. To find viruses, they took samples from bats and other animals, as well as from sick people living near animals carrying these viruses or associated with the wildlife trade. Much of this work was conducted in partnership with the EcoHealth Alliance, a U.S.-based scientific organization that, since 2002, has been awarded over $80 million in federal funding to research the risks of emerging infectious diseases.
  • Their research showed that the viruses most similar to SARS‑CoV‑2, the virus that caused the pandemic, circulate in bats that live roughly 1,000 miles away from Wuhan. Scientists from Dr. Shi’s team traveled repeatedly to Yunnan province to collect these viruses and had expanded their search to Southeast Asia. Bats in other parts of China have not been found to carry viruses that are as closely related to SARS-CoV-2.
  • When the Covid-19 outbreak was detected, Dr. Shi initially wondered if the novel coronavirus had come from her laboratory, saying she had never expected such an outbreak to occur in Wuhan.
  • The SARS‑CoV‑2 virus is exceptionally contagious and can jump from species to species like wildfire. Yet it left no known trace of infection at its source or anywhere along what would have been a thousand-mile journey before emerging in Wuhan.
  • The year before the outbreak, the Wuhan institute, working with U.S. partners, had proposed creating viruses with SARS‑CoV‑2’s defining feature
  • The laboratory pursued risky research that resulted in viruses becoming more infectious: Coronaviruses were grown from samples from infected animals and genetically reconstructed and recombined to create new viruses unknown in nature. These new viruses were passed through cells from bats, pigs, primates and humans and were used to infect civets and humanized mice (mice modified with human genes). In essence, this process forced these viruses to adapt to new host species, and the viruses with mutations that allowed them to thrive emerged as victors.
  • Worse still, as the pandemic raged, their American collaborators failed to publicly reveal the existence of the Defuse proposal. The president of EcoHealth, Peter Daszak, recently admitted to Congress that he doesn’t know about virus samples collected by the Wuhan institute after 2015 and never asked the lab’s scientists if they had started the work described in Defuse.
  • By 2019, Dr. Shi’s group had published a database describing more than 22,000 collected wildlife samples. But external access was shut off in the fall of 2019, and the database was not shared with American collaborators even after the pandemic started, when such a rich virus collection would have been most useful in tracking the origin of SARS‑CoV‑2. It remains unclear whether the Wuhan institute possessed a precursor of the pandemic virus.
  • In 2021, The Intercept published a leaked 2018 grant proposal for a research project named Defuse, which had been written as a collaboration between EcoHealth, the Wuhan institute and Ralph Baric at the University of North Carolina, who had been on the cutting edge of coronavirus research for years. The proposal described plans to create viruses strikingly similar to SARS‑CoV‑2.
  • Coronaviruses bear their name because their surface is studded with protein spikes, like a spiky crown, which they use to enter animal cells. The Defuse project proposed to search for and create SARS-like viruses carrying spikes with a unique feature: a furin cleavage site — the same feature that enhances SARS‑CoV‑2’s infectiousness in humans, making it capable of causing a pandemic. Defuse was never funded by the United States.
  • However, in his testimony on Monday, Dr. Fauci explained that the Wuhan institute would not need to rely on U.S. funding to pursue research independently.
  • While it’s possible that the furin cleavage site could have evolved naturally (as seen in some distantly related coronaviruses), out of the hundreds of SARS-like viruses cataloged by scientists, SARS‑CoV‑2 is the only one known to possess a furin cleavage site in its spike. And the genetic data suggest that the virus had only recently gained the furin cleavage site before it started the pandemic.
  • Ultimately, a never-before-seen SARS-like virus with a newly introduced furin cleavage site, matching the description in the Wuhan institute’s Defuse proposal, caused an outbreak in Wuhan less than two years after the proposal was drafted.
  • When the Wuhan scientists published their seminal paper about Covid-19 as the pandemic roared to life in 2020, they did not mention the virus’s furin cleavage site — a feature they should have been on the lookout for, according to their own grant proposal, and a feature quickly recognized by other scientists.
  • At the Wuhan Institute of Virology, a team of scientists had been hunting for SARS-like viruses for over a decade, led by Shi Zhengli.
  • In May, citing failures in EcoHealth’s monitoring of risky experiments conducted at the Wuhan lab, the Biden administration suspended all federal funding for the organization and Dr. Daszak, and initiated proceedings to bar them from receiving future grants. In his testimony on Monday, Dr. Fauci said that he supported the decision to suspend and bar EcoHealth.
  • Separately, Dr. Baric described the competitive dynamic between his research group and the institute when he told Congress that the Wuhan scientists would probably not have shared their most interesting newly discovered viruses with him. Documents and email correspondence between the institute and Dr. Baric are still being withheld from the public while their release is fiercely contested in litigation.
  • In the end, American partners very likely knew of only a fraction of the research done in Wuhan. According to U.S. intelligence sources, some of the institute’s virus research was classified or conducted with or on behalf of the Chinese military.
  • In the congressional hearing on Monday, Dr. Fauci repeatedly acknowledged the lack of visibility into experiments conducted at the Wuhan institute, saying, “None of us can know everything that’s going on in China, or in Wuhan, or what have you. And that’s the reason why — I say today, and I’ve said at the T.I.,” referring to his transcribed interview with the subcommittee, “I keep an open mind as to what the origin is.”
  • The Wuhan lab pursued this type of work under low biosafety conditions that could not have contained an airborne virus as infectious as SARS‑CoV‑2.
  • Labs working with live viruses generally operate at one of four biosafety levels (known in ascending order of stringency as BSL-1, 2, 3 and 4) that describe the work practices that are considered sufficiently safe depending on the characteristics of each pathogen. The Wuhan institute’s scientists worked with SARS-like viruses under inappropriately low biosafety conditions.
  • ​​Biosafety levels are not internationally standardized, and some countries use more permissive protocols than others.
  • In one experiment, Dr. Shi’s group genetically engineered an unexpectedly deadly SARS-like virus (not closely related to SARS‑CoV‑2) that exhibited a 10,000-fold increase in the quantity of virus in the lungs and brains of humanized mice. Wuhan institute scientists handled these live viruses at low biosafety levels, including BSL-2.
  • Even the much more stringent containment at BSL-3 cannot fully prevent SARS‑CoV‑2 from escaping. Two years into the pandemic, the virus infected a scientist in a BSL-3 laboratory in Taiwan, which was, at the time, a zero-Covid country. The scientist had been vaccinated and was tested only after losing the sense of smell. By then, more than 100 close contacts had been exposed. Human error is a source of exposure even at the highest biosafety levels, and the risks are much greater for scientists working with infectious pathogens at low biosafety.
  • An early draft of the Defuse proposal stated that the Wuhan lab would do their virus work at BSL-2 to make it “highly cost-effective.” Dr. Baric added a note to the draft highlighting the importance of using BSL-3 to contain SARS-like viruses that could infect human cells, writing that “U.S. researchers will likely freak out.”
  • Years later, after SARS‑CoV‑2 had killed millions, Dr. Baric wrote to Dr. Daszak: “I have no doubt that they followed state determined rules and did the work under BSL-2. Yes China has the right to set their own policy. You believe this was appropriate containment if you want but don’t expect me to believe it. Moreover, don’t insult my intelligence by trying to feed me this load of BS.”
  • SARS‑CoV‑2 is a stealthy virus that transmits effectively through the air, causes a range of symptoms similar to those of other common respiratory diseases and can be spread by infected people before symptoms even appear. If the virus had escaped from a BSL-2 laboratory in 2019, the leak most likely would have gone undetected until too late.
  • One alarming detail — leaked to The Wall Street Journal and confirmed by current and former U.S. government officials — is that scientists on Dr. Shi’s team fell ill with Covid-like symptoms in the fall of 2019. One of the scientists had been named in the Defuse proposal as the person in charge of virus discovery work. The scientists denied having been sick.
  • The hypothesis that Covid-19 came from an animal at the Huanan Seafood Market in Wuhan is not supported by strong evidence.
  • In December 2019, Chinese investigators assumed the outbreak had started at a centrally located market frequented by thousands of visitors daily. This bias in their search for early cases meant that cases unlinked to or located far away from the market would very likely have been missed
  • To make things worse, the Chinese authorities blocked the reporting of early cases not linked to the market and, claiming biosafety precautions, ordered the destruction of patient samples on January 3, 2020, making it nearly impossible to see the complete picture of the earliest Covid-19 cases. Information about dozens of early cases from November and December 2019 remains inaccessible.
  • A pair of papers published in Science in 2022 made the best case for SARS‑CoV‑2 having emerged naturally from human-animal contact at the Wuhan market by focusing on a map of the early cases and asserting that the virus had jumped from animals into humans twice at the market in 2019
  • More recently, the two papers have been countered by other virologists and scientists who convincingly demonstrate that the available market evidence does not distinguish between a human superspreader event and a natural spillover at the market.
  • Furthermore, the existing genetic and early case data show that all known Covid-19 cases probably stem from a single introduction of SARS‑CoV‑2 into people, and the outbreak at the Wuhan market probably happened after the virus had already been circulating in humans.
  • Not a single infected animal has ever been confirmed at the market or in its supply chain. Without good evidence that the pandemic started at the Huanan Seafood Market, the fact that the virus emerged in Wuhan points squarely at its unique SARS-like virus laboratory.
  • With today’s technology, scientists can detect how respiratory viruses — including SARS, MERS and the flu — circulate in animals while making repeated attempts to jump across species. Thankfully, these variants usually fail to transmit well after crossing over to a new species and tend to die off after a small number of infections
  • investigators have not reported finding any animals infected with SARS‑CoV‑2 that had not been infected by humans. Yet, infected animal sources and other connective pieces of evidence were found for the earlier SARS and MERS outbreaks as quickly as within a few days, despite the less advanced viral forensic technologies of two decades ago.
  • Even though Wuhan is the home base of virus hunters with world-leading expertise in tracking novel SARS-like viruses, investigators have either failed to collect or report key evidence that would be expected if Covid-19 emerged from the wildlife trade. For example, investigators have not determined that the earliest known cases had exposure to intermediate host animals before falling ill.
  • No antibody evidence shows that animal traders in Wuhan are regularly exposed to SARS-like viruses, as would be expected in such situations.
  • In previous outbreaks of coronaviruses, scientists were able to demonstrate natural origin by collecting multiple pieces of evidence linking infected humans to infected animals
  • In contrast, virologists and other scientists agree that SARS‑CoV‑2 required little to no adaptation to spread rapidly in humans and other animals. The virus appears to have succeeded in causing a pandemic upon its only detected jump into humans.
  • it was a SARS-like coronavirus with a unique furin cleavage site that emerged in Wuhan, less than two years after scientists, sometimes working under inadequate biosafety conditions, proposed collecting and creating viruses of that same design.
  • a laboratory accident is the most parsimonious explanation of how the pandemic began.
  • Given what we now know, investigators should follow their strongest leads and subpoena all exchanges between the Wuhan scientists and their international partners, including unpublished research proposals, manuscripts, data and commercial orders. In particular, exchanges from 2018 and 2019 — the critical two years before the emergence of Covid-19 — are very likely to be illuminating (and require no cooperation from the Chinese government to acquire), yet they remain beyond the public’s view more than four years after the pandemic began.
  • it is undeniable that U.S. federal funding helped to build an unprecedented collection of SARS-like viruses at the Wuhan institute, as well as contributing to research that enhanced them.
  • Advocates and funders of the institute’s research, including Dr. Fauci, should cooperate with the investigation to help identify and close the loopholes that allowed such dangerous work to occur. The world must not continue to bear the intolerable risks of research with the potential to cause pandemics.
  • A successful investigation of the pandemic’s root cause would have the power to break a decades-long scientific impasse on pathogen research safety, determining how governments will spend billions of dollars to prevent future pandemics. A credible investigation would also deter future acts of negligence and deceit by demonstrating that it is indeed possible to be held accountable for causing a viral pandemic
  • Last but not least, people of all nations need to see their leaders — and especially, their scientists — heading the charge to find out what caused this world-shaking event. Restoring public trust in science and government leadership requires it.

The Plight of the Overworked Nonprofit Employee - The Atlantic - 0 views

  • Many nonprofit organizations stare down a shared set of challenges: In a 2013 report, the Urban Institute surveyed over 4,000 nonprofits of a wide range of types and sizes across the continental U.S. It found that all kinds of nonprofits struggled with delays in payment for contracts, difficulty securing funding for the full cost of their services, and other financial issues.
  • Recent years have been especially hard for many nonprofits. Most have annual budgets of less than $1 million, and those budgets took a big hit from the recession, when federal, municipal, and philanthropic funding dried up. On top of that, because so many nonprofits depend on government money, policy changes can cause funding priorities to change, which in turn can put nonprofits in a bind.
  • The pressure from funders to tighten budgets and cut costs can produce what researchers call the “nonprofit starvation cycle.” The cycle starts with funders’ unrealistic expectations about the costs of running a nonprofit. In response, nonprofits try to spend less on overhead (like salaries) and under-report expenses to try to meet those unrealistic expectations. That response then reinforces the unrealistic expectations that began the cycle. In this light, it’s no surprise that so many nonprofits have come to rely on unpaid work.
  • ...15 more annotations...
  • Strangely, though nonprofits are increasingly expected to perform like businesses, they do not get the same leeway in funding that government-contracted businesses do. They don’t have nearly the bargaining power of big corporations, or the ability to raise costs for their products and services, because of tight controls on grant funding. “D.C. is full of millionaires who contract with government in the defense field, and they make a killing, and yet if you’re a nonprofit, chances are you aren’t getting the full amount of funding to cover the cost of the services required,” Iliff said. “Can you imagine Lockheed Martin or Boeing putting up with a government contract that didn’t allow for overhead?”
  • When faced with dwindling funding, one response would be to cut a program or reduce the number of people an organization serves. But nonprofit leaders have shown themselves very reluctant to do that. Instead, many meet financial challenges by squeezing more work out of their staffs without a proportional increase in their pay:
  • nonprofits like PIRG, for example, have a tradition of forcing employees to work long, unpaid hours—especially their youngest staff. “There’s a culture that says, ‘Young people are paying their dues. It’s okay for them to be paid for fewer hours than they’re actually working because it’s in the effort of helping them grow up and contribute to something greater than they are,’” Boris says.
  • These nonprofit employees are saying that their operations depend on large numbers of their lowest-paid staff working unpaid overtime hours. One way to get to that point would be to face a series of choices between increased productivity on the one hand and reduced hours, increased pay, or more hiring on the other, and to choose more productivity every time. That some nonprofits have done this speaks to a culture that can put the needs of staff behind mission-driven ambitions.
  • In the 1970s, 62 percent of full-time, salaried workers qualified for mandatory overtime pay when they worked more than 40 hours in a week. Today, because the overtime rules have not had a major update since then (until this one), only 7 percent of workers are covered, whether they work in the nonprofit sector or elsewhere. In other words, U.S. organizations—nonprofit or otherwise—have been given the gift of a large pool of laborers who, as long as they clear a relatively low earnings threshold and do tasks that meet certain criteria, do not have to be paid overtime.
  • Unsurprisingly, many nonprofits have taken advantage of that pool of free work. (For-profit companies have too, but they also have the benefit of being more in control of their revenue streams.)
  • “There is this feeling that the mission is so important that nothing should get in the way of it,”
  • “Too often, I have seen the passion for social change turned into a weapon against the very people who do much—if not most—of the hard work, and put in most of the hours,” Hastings recently wrote on her blog. “Because they are highly motivated by passion, the reasoning goes, they don’t need to be motivated by decent salaries or sustainable work hours or overtime pay.”
  • A 2011 survey of more than 2,000 nonprofit employees by Opportunity Knocks, a human-resources organization that specializes in nonprofits, in partnership with Jessica Word, an associate professor of public administration at the University of Nevada, Las Vegas, found that half of employees in the nonprofit sector may be burned out or in danger of burnout.
  • “These are highly emotional and difficult jobs,” she said, adding, “These organizations often have very high rates of employee turnover, which results from a combination of burnout and low compensation.” Despite the dearth of research, Word’s findings don’t appear to be unusual: A more recent study of nonprofits in the U.S. and Canada found that turnover, one possible indicator of burnout, is higher in nonprofits than in the overall labor market.
  • for all their hours and emotional labor, nonprofit employees generally don’t make much money. A 2014 study by Third Sector New England, a resource center for nonprofits, found that 43 percent of nonprofit employees in New England were making less than $28,000 per year—far less than a living wage for families with children in most cities in the United States, and well below the national median income of between $40,000 and $50,000 per year.
  • Why would nonprofit workers be willing to stay in jobs where they are underpaid, or, in some cases, accept working conditions that violate the spirit of the labor laws that protect them? One plausible reason is that they are just as committed to the cause as their superiors
  • But it also might be that some nonprofits exploit gray areas in the law to cut costs. For instance, only workers who are labelled as managers are supposed to be exempt from overtime, but many employers stretch the definition of “manager” far beyond its original intent.
  • even regardless of these designations, the emotionally demanding work at many nonprofits is sometimes difficult to shoehorn into a tidy 40-hours-a-week schedule. Consider Elle Roberts, who was considered exempt from overtime restrictions and was told not to work more than 40 hours a week when, as a young college grad, she worked at a domestic-violence shelter in northwest Indiana. Doing everything from home visits to intake at the shelter, Roberts still ignored her employer’s dictates and regularly worked well more than 40 hours a week providing relief for women in crisis. Yet she was not paid for that extra time.
  • “The unspoken expectation is that you do whatever it takes to get whatever it is done for the people that you’re serving,” she says. “And anything less than that, you’re not quite doing enough.”

Congress Is Running Out of Time to Save Puerto Rico - The Atlantic - 0 views

  • A Commonwealth in Crisis
  • On Sunday, Puerto Rico will likely default again on some of its debts, which now total over $70 billion.
  • It is an entity that is often almost completely at the whim of Congress, the most dysfunctional body in national politics today.
  • ...10 more annotations...
  • Its economy has not grown in over a decade, and there is a $28 billion gap in funding over the next five years alone.
  • Public hospitals and health-care facilities may close, exacerbating an ongoing crisis that already sees Puerto Rican patients receiving far worse health care than their mainland counterparts.
  • Puerto Rico has attempted to solve its debt issues by granting itself bankruptcy powers under rules that grant service providers in the states the ability to seek relief, but that decision remains under Supreme Court review and has been opposed in no uncertain terms by the federal government.
  • The plan also exempts Puerto Rican workers under 25 years old from the labor protections of a federally-mandated minimum wage and overtime regulation, with the goal of making Puerto Rico’s job market more competitive in comparison to its neighbors.
  • Representatives of those who hold the territory’s general obligation bonds oppose the bill as well, because it would allow Puerto Rico to restructure its debts and delay payment, as well as limit the ability of bondholders to sue if it defaults.
  • In 1984 Congress specifically carved Puerto Rico out from Chapter 9 municipal bankruptcy protections that it once had, offering no reason for singling out the territory.
  • Also in 1996, Congress passed legislation to phase out Section 936 of the federal tax code, a law that exempted U.S. industries from taxes on income in Puerto Rico. With no replacement plan to promote development or growth, Puerto Rico’s economy suffered.
  • The Medicaid underpayment deficit alone accounts for almost one-fifth of the current total deficit in the territory. A legislative medical-funding scheme could both help right the financial ship in Puerto Rico and help its crippled health-care system face the siege of Zika.
  • Granted, the government of Puerto Rico is not blameless in this saga. Its public-services monopolies are extraordinarily inefficient; the territorial government has not been good with spending, budgeting, or long-term fiscal planning; and the two major parties––for statehood and the status quo––have supported some congressional reforms with the goal of forcing the other side’s hand, instead of promoting good governance.
  • Congress’s dysfunctions might run so deep that they keep it from even addressing a humanitarian crisis in the country, a total failure of the body’s special duty towards Puerto Rico. Congress’s current plan might provide short-term relief, but it is a bit of a Hobson’s choice.

5 Big Banks Expected to Plead Guilty to Felony Charges, but Punishments May Be Tempered... - 0 views

  • For most people, pleading guilty to a felony means they will very likely land in prison, lose their job and forfeit their right to vote.
  • But when five of the world’s biggest banks plead guilty to an array of antitrust and fraud charges as soon as next week, life will go on, probably without much of a hiccup.
  • Most if not all of the pleas are expected to come from the banks’ holding companies, the people said — a first for Wall Street giants that until now have had only subsidiaries or their biggest banking units plead guilty.
  • ...14 more annotations...
  • The Justice Department is also preparing to resolve accusations of foreign currency misconduct at UBS. As part of that deal, prosecutors are taking the rare step of tearing up a 2012 nonprosecution agreement with the bank over the manipulation of benchmark interest rates, the people said, citing the bank’s foreign currency misconduct as a violation of the earlier agreement.
  • The guilty pleas, scarlet letters affixed to banks of this size and significance, represent another prosecutorial milestone in a broader effort to crack down on financial misdeeds. Yet as much as prosecutors want to punish banks for misdeeds, they are also mindful that too harsh a penalty could imperil banks that are at the heart of the global economy, a balancing act that could produce pleas that are more symbolic than sweeping.
  • While the S.E.C.’s five commissioners have not yet voted on the requests for waivers, which would allow the banks to conduct business as usual despite being felons, the people briefed on the matter expected a majority of commissioners to grant them.
  • In reality, those accommodations render the plea deals, at least in part, an exercise in stagecraft. And while banks might prefer a deferred-prosecution agreement that suspends charges in exchange for fines and other concessions — or a nonprosecution deal like the one that UBS is on the verge of losing — the reputational blow of being a felon does not spell disaster.
  • The action against UBS underscores the threats that Justice Department officials issued in recent months about voiding past deals in the event of new misdeeds, a central tactic in a plan to address the cycle of corporate recidivism. Leslie Caldwell, the head of the Justice Department’s criminal division, recently remarked that she “will not hesitate to tear up a D.P.A. or N.P.A. and file criminal charges where such action is appropriate.”
  • The Justice Department negotiations coincide with the banks’ separate efforts to persuade the S.E.C. to issue waivers from automatic bans that occur when a company pleads guilty. If the waivers are not granted, a decision that the Justice Department does not control, the banks could face significant consequences.
  • For example, some banks may be seeking waivers to a ban on overseeing mutual funds, one of the people said. They are also requesting waivers to ensure they do not lose their special status as “well-known seasoned issuers,” which allows them to fast-track securities offerings. For some of the banks, there is also a concern that they will lose their “safe harbor” status for making forward-looking statements in securities documents.
  • it seemed probable that a majority of the S.E.C.’s commissioners would approve most of the waivers, which can be granted for a cause like the public good. Still, the agency’s two Democratic commissioners — Kara M. Stein and Luis A. Aguilar, who have denounced the S.E.C.’s use of waivers — might be more likely to balk.
  • Senator Elizabeth Warren, Democrat of Massachusetts, and other liberal politicians have criticized prosecutors for treating Wall Street with kid gloves. Banks and their lawyers, however, complain about huge penalties and guilty pleas.
  • lingering in the background is the case of Arthur Andersen, an accounting giant that imploded after being convicted in 2002 of criminal charges related to its work for Enron. After the firm’s collapse, and the later reversal of its conviction, prosecutors began to shift from indictments and guilty pleas to deferred-prosecution agreements. And in 2008, the Justice Department updated guidelines for prosecuting corporations, which have long included a requirement that prosecutors weigh collateral consequences like harm to shareholders and innocent employees.
  • “The collateral consequences consideration is designed to address the risk that a particular criminal charge might inflict disproportionate harm to shareholders, pension holders and employees who are not even alleged to be culpable or to have profited potentially from wrongdoing,” said Mark Filip, the Justice Department official who wrote the 2008 memo. “Arthur Andersen was ultimately never convicted of anything, but the mere act of indicting it destroyed one of the cornerstones of the Midwest’s economy.”
  • After years of deferred-prosecution agreements, the pendulum swung back in favor of guilty pleas in 2012
  • In pursuing cases last year against Credit Suisse and BNP Paribas, prosecutors confronted the popular belief that banks had grown so important to the economy that they could not be charged
  • Yet after prosecutors announced the deals, the banks’ chief executives promptly assured investors that the effect would be minimal. “Apart from the impact of the fine, BNP Paribas will once again post solid results this quarter,” BNP’s chief, Jean-Laurent Bonnafé, said. Brady Dougan, Credit Suisse’s chief at the time, said the deal would not cause “any material impact on our operational or business capabilities.”

The Real Story of How America Became an Economic Superpower - The Atlantic - 0 views

  • a new history of the 20th century: the American century, which according to Tooze began not in 1945 but in 1916, the year U.S. output overtook that of the entire British empire.
  • The two books narrate the arc of American economic supremacy from its beginning to its apogee. It is both ominous and fitting that the second volume of the story was published in 2014, the year in which—at least by one economic measure—that supremacy came to an end.
  • “Britain has the earth, and Germany wants it.” Such was Woodrow Wilson’s analysis of the First World War in the summer of 1916,
  • ...36 more annotations...
  • what about the United States? Before the 1914 war, the great economic potential of the U.S. was suppressed by its ineffective political system, dysfunctional financial system, and uniquely violent racial and labor conflicts. “America was a byword for urban graft, mismanagement and greed-fuelled politics, as much as for growth, production, and profit,”
  • as World War I entered its third year—and the first year of Tooze’s story—the balance of power was visibly tilting from Europe to America. The belligerents could no longer sustain the costs of offensive war. Cut off from world trade, Germany hunkered into a defensive siege, concentrating its attacks on weak enemies like Romania. The Western allies, and especially Britain, outfitted their forces by placing larger and larger war orders with the United States
  • His Wilson is no dreamy idealist. The president’s animating idea was an American exceptionalism of a now-familiar but then-startling kind.
  • That staggering quantity of Allied purchases called forth something like a war mobilization in the United States. American factories switched from civilian to military production; American farmers planted food and fiber to feed and clothe the combatants of Europe
  • But unlike in 1940-41, the decision to commit so much to one side’s victory in a European war was not a political decision by the U.S. government. Quite the contrary: President Wilson wished to stay out of the war entirely. He famously preferred a “peace without victory.” The trouble was that by 1916, the U.S. commitment to Britain and France had grown—to borrow a phrase from the future—too big to fail.
  • His Republican opponents—men like Theodore Roosevelt, Henry Cabot Lodge, and Elihu Root—wished to see America take its place among the powers of the earth. They wanted a navy, an army, a central bank, and all the other instrumentalities of power possessed by Britain, France, and Germany. These political rivals are commonly derided as “isolationists” because they mistrusted Wilson’s League of Nations project. That’s a big mistake. They doubted the League because they feared it would encroach on American sovereignty.
  • Grant presents this story as a laissez-faire triumph. Wartime inflation was halted. Borrowing and spending gave way to saving and investing. Recovery then occurred naturally, without any need for government stimulus. “The hero of my narrative is the price mechanism, Adam Smith’s invisible hand,
  • It was Wilson who wished to remain aloof from the Entente, who feared that too close an association with Britain and France would limit American options.
  • Wilson was guided by a different vision: Rather than join the struggle of imperial rivalries, the United States could use its emerging power to suppress those rivalries altogether. Wilson was the first American statesman to perceive that the United States had grown, in Tooze’s words, into “a power unlike any other. It had emerged, quite suddenly, as a novel kind of ‘super-state,’ exercising a veto over the financial and security concerns of the other major states of the world.”
  • Wilson hoped to deploy this emerging super-power to enforce an enduring peace. His own mistakes and those of his successors doomed the project,
  • What went wrong? “When all is said and done,” Tooze writes, “the answer must be sought in the failure of the United States to cooperate with the efforts of the French, British, Germans and the Japanese [leaders of the early 1920s] to stabilize a viable world economy and to establish new institutions of collective security. … Given the violence they had already experienced and the risk of even greater future devastation, France, Germany, Japan, and Britain could all see this. But what was no less obvious was that only the US could anchor such a new order.”
  • And that was what Americans of the 1920s and 1930s declined to do—because doing so implied too much change at home for them: “At the hub of the rapidly evolving, American-centered world system there was a polity wedded to a conservative vision of its own future.”
  • The Forgotten Depression is a polemic embedded within a narrative, an argument against the Obama stimulus joined to an account of the depression of 1920-21. As Grant correctly observes, that depression was one of the sharpest and most painful in American history.
  • Then, after 18 months of extremely hard times, the economy lurched into recovery. By 1923, the U.S. had returned to full employment.
  • “By the end of 1916, American investors had wagered two billion dollars on an Entente victory,” computes Tooze (relative to America’s estimated GDP of $50 billion in 1916, the equivalent of $560 billion in today’s money).
  • the central assumption of his version of events is the same one captured in Rothbard’s title half a century ago: that America’s economic history constitutes a story unto itself.
  • Americans, meanwhile, were preoccupied with the problem of German recovery. How could Germany achieve political stability if it had to pay so much to France and Belgium? The Americans pressed the French to relent when it came to Germany, but insisted that their own claims be paid in full by both France and Britain.
  • Germany, for its part, could only pay if it could export, and especially to the world’s biggest and richest consumer market, the United States. The depression of 1920 killed those export hopes. Most immediately, the economic crisis sliced American consumer demand precisely when Europe needed it most.
  • But the gravest harm done by the depression to postwar recovery lasted long past 1921. To appreciate that, you have to understand the reasons why U.S. monetary authorities plunged the country into depression in 1920.
  • Monetary authorities, worried that inflation would revive and accelerate, made the fateful decision to slam the credit brakes, hard. Unlike the 1918 recession, that of 1920 was deliberately engineered. There was nothing invisible about it. Nor did the depression “cure itself.” U.S. officials cut interest rates and relaxed credit, and the economy predictably recovered
  • But 1920-21 was an inflation-stopper with a difference. In post-World War II America, anti-inflationists have been content to stop prices from rising. In 1920-21, monetary authorities actually sought to drive prices back to their pre-war levels
  • James Grant hails this accomplishment. Adam Tooze forces us to reckon with its consequences for the rest of the planet.
  • When the U.S. opted for massive deflation, it thrust upon every country that wished to return to the gold standard (and what respectable country would not?) an agonizing dilemma. Return to gold at 1913 values, and you would have to match U.S. deflation with an even steeper deflation of your own, accepting increased unemployment along the way. Alternatively, you could re-peg your currency to gold at a diminished rate. But that amounted to an admission that your money had permanently lost value—and that your own people, who had trusted their government with loans in local money, would receive a weaker return on their bonds than American creditors who had lent in dollars.
  • Britain chose the former course; pretty much everybody else chose the latter.
  • The consequences of these choices fill much of the second half of The Deluge. For Europeans, they were uniformly grim, and worse.
  • But one important effect ultimately rebounded on Americans. America’s determination to restore a dollar “as good as gold” not only imposed terrible hardship on war-ravaged Europe, it also threatened to flood American markets with low-cost European imports. The flip side of the Lost Generation enjoying cheap European travel with their strong dollars was German steelmakers and shipyards underpricing their American competitors with weak marks.
  • American leaders of the 1920s weren’t willing to accept this outcome. In 1921 and 1923, they raised tariffs, terminating a brief experiment with freer trade undertaken after the election of 1912. The world owed the United States billions of dollars, but the world was going to have to find another way of earning that money than selling goods to the United States.
  • Between 1924 and 1930, world financial flows could be simplified into a daisy chain of debt. Germans borrowed from Americans, and used the proceeds to pay reparations to the Belgians and French. The French and Belgians, in turn, repaid war debts to the British and Americans. The British then used their French and Italian debt payments to repay the United States, which set the whole crazy contraption in motion again. Everybody could see the system was crazy. Only the United States could fix it. It never did.
  • The reckless desperation of Hitler’s war provides context for the horrific crimes of his regime. Hitler’s empire could not feed itself, so his invasion plan for the Soviet Union contemplated the death by starvation of 20 to 30 million Soviet urban dwellers after the invaders stole all foodstuffs for their own use. Germany lacked workers, so it plundered the labor of its conquered peoples. By 1944, foreigners constituted 20 percent of the German workforce and 33 percent of armaments workers
  • “If man accumulates enough combustible material, God will provide the spark.” So it happened in 1929. The Deluge that had inundated the rest of the developed world roared back upon the United States.
  • From the start, the United States was Hitler’s ultimate target. “In seeking to explain the urgency of Hitler’s aggression, historians have underestimated his acute awareness of the threat posed to Germany, along with the rest of the European powers, by the emergence of the United States as the dominant global superpower,” Tooze writes. “The originality of National Socialism was that, rather than meekly accepting a place for Germany within a global economic order dominated by the affluent English-speaking countries, Hitler sought to mobilize the pent-up frustrations of his population to mount an epic challenge to this order.”
  • Germany was a weaker and poorer country in 1939 than it had been in 1914. Compared with Britain, let alone the United States, it lacked the basic elements of modernity: There were just 486,000 automobiles in Germany in 1932, and one-quarter of all Germans still worked as farmers as of 1925. Yet this backward land, with an income per capita comparable to contemporary “South Africa, Iran and Tunisia,” wagered on a second world war even more audacious than the first.
  • That way was found: more debt, especially more German debt. The 1923 hyper-inflation that wiped out Germany’s savers also tidied up the country’s balance sheet. Post-inflation Germany looked like a very creditworthy borrower.
  • On paper, the Nazi empire of 1942 represented a substantial economic bloc. But pillage and slavery are not workable bases for an industrial economy. Under German rule, the output of conquered Europe collapsed. The Hitlerian vision of a united German-led Eurasia equaling the Anglo-American bloc proved a crazed and genocidal fantasy.
  • The foundation of this order was America’s rise to unique economic predominance a century ago. That predominance is now coming to an end as China does what the Soviet Union and Imperial Germany never could: rise toward economic parity with the United States.
  • It is coming, and when it does, the fundamental basis of world-power politics over the past 100 years will have been removed. Just how big and dangerous a change that will be is the deepest theme of Adam Tooze's profound and brilliant grand narrative.