These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders, and the Bill of Rights forbids the federal government from establishing a religion.
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprung up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new "Rules Governing the Application of Passports," which required evidence of identity, including a close physical description ("Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____") as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
  • [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word "person" in the Fourteenth Amendment, Conkling enjoyed a singular authority: he'd served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word "citizen" in favor of "person" in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • State legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,

        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from the middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • Wars, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

Hospital Prices Are Arbitrary. Just Look at the Kingsburys' $100,000 Bill. - WSJ - 0 views

  • The costs, which overwhelmed the Kingsburys and ruined their finances, didn’t have to be so large. A Wall Street Journal analysis of Ms. Kingsbury’s medical bills, insurance statements and newly public data on hospital prices shows how the nation’s seemingly arbitrary hospital pricing left the couple with charges that in some cases would have been far lower for other patients, through no fault of their own.
  • Ms. Kingsbury had insurance, but that’s no guarantee of a competitive price. Hospitals and insurers negotiate prices to hit financial targets, and their bargaining benefits some patients and disadvantages others, according to the Journal’s analysis and interviews with medical billing professionals and researchers.
  • A weak negotiator can get stuck with a lousy deal. Trade-offs can give one insurance plan the best deals for some hospital services, but not others. Hospitals often charge patients the highest rates of all when insurance doesn’t cover their medical care
  • For many patients and their families, hospital fees are already complicated, opaque and stressful. The Kingsburys show just how little control consumers have.
  • None of this has been clear to consumers—until this year. Hospitals and insurers have long set prices through confidential negotiations. Starting Jan. 1, hospitals were required to make their prices public under a Trump administration policy that sought to expose the sector’s pricing to greater market pressure.
  • Compliance with the rule has been spotty, but the available data show that prices vary widely among the plans that negotiate contracts with hospitals. While the data remains difficult for consumers to use, knowing the full range of rates could ultimately help patients negotiate their bills.
  • Healthcare economists note that prices in other sectors, such as airlines, can also vary for the same service, but hospitals’ steep prices mean the dollar difference between the highest and lowest rates can amount to tens of thousands of dollars. “The order of magnitude of healthcare costs is different,”
  • Even within an insurance plan, prices aren’t consistently low or high. A plan’s prices for one service can be among the lowest a hospital negotiates, but among the highest for another,
  • A person insured by Minnesota-based HealthPartners would have received the most favorable price for a hospital stay because of back problems, but the cost of an emergency room visit with the same insurance was among the highest, according to the Journal’s analysis of the data.
  • When insurance didn’t cover some treatments, the Journal found, Avera McKennan Hospital set its own prices that ranked among the highest anywhere in the U.S. in the Journal’s analysis.
  • The LifeShield price of about $780 amounted to a discount of 53% off the hospital’s charge. Ms. Kingsbury paid all of it because her plan’s benefits didn’t cover the rest of the bill. The insurance was exempt from some federal rules that protect healthcare consumers. LifeShield didn’t respond to requests for comment.
  • Ms. Kingsbury earned roughly $17,700 last year, tax records reviewed by the Journal show. Her husband, who is retired, received about $22,800 in yearly income from Social Security. They bought insurance in 2019 from LifeShield National Insurance Co.
  • The range of prices is the product of a complex interplay of multiple payers and hospitals, and a lack of competitive pressure to hold down costs, economists said. Rates have been determined by trade-offs at the bargaining table between hospitals and insurers—such as an offer of cheaper prices in return for more business—and by market power, with higher prices where hospitals dominate.
  • Hospitals and insurers ultimately bargain for prices to meet financial targets for revenue and profit, said David Dillon, a healthcare actuary with the consulting firm Lewis & Ellis Inc. “It is kind of as simple as both sides of the table have their revenue requirements,” he said.
  • “The market for healthcare just doesn’t look at all like the market for tomatoes because somebody else is literally negotiating and purchasing on your behalf,” Mr. Cooper said.
  • The cost for the scan under LifeShield was $1,497, almost half the price charged under Avera. However, Ms. Kingsbury’s plan at LifeShield was exempt from Affordable Care Act rules to prevent gaps in coverage. LifeShield didn’t cover this scan. So Avera charged Ms. Kingsbury the price it sets for patients not covered by insurance, at $8,451, one of the highest prices in the Journal’s analysis of publicly available rates nationwide.
  • “Healthcare is a service and it can be an expensive service, especially for a serious condition. That’s why health insurance exists,” said Avera spokeswoman Ms. Meyers. “It is important for consumers to understand what they are buying and the coverage it provides.”
  • The Journal compared Avera McKennan’s 2019 PET CT price for Ms. Kingsbury with the price Medicare would pay, as calculated by price-comparison startup Turquoise Health Co. The hospital’s cash price for Ms. Kingsbury in 2019 was 5.7 times the Medicare rate, according to the Journal’s analysis using newly public data collected by Turquoise. That’s one of the highest multiples of any of the more than 1,200 U.S. hospitals in the analysis.
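The price figures quoted in these excerpts can be cross-checked with quick back-of-envelope arithmetic. A minimal sketch (Python used only for illustration; the implied Medicare rate and list charge are approximations derived from the quoted numbers, not figures from the article):

```python
# Arithmetic implied by the quoted WSJ figures.
cash_price = 8451        # Avera's price when insurance didn't cover the scan
medicare_multiple = 5.7  # cash price as a multiple of the Medicare rate
implied_medicare_rate = cash_price / medicare_multiple
print(f"Implied Medicare rate: ${implied_medicare_rate:,.0f}")  # ≈ $1,483

negotiated = 780         # approximate LifeShield-negotiated price
discount = 0.53          # described as 53% off the hospital's list charge
implied_list_charge = negotiated / (1 - discount)
print(f"Implied list charge: ${implied_list_charge:,.0f}")  # ≈ $1,660
```

The same service thus carries at least three prices at one hospital: the negotiated rate, the list charge, and the cash price, spanning roughly an order of magnitude.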
Javier E

Opinion | A Better Path to Universal Health Care - The New York Times - 0 views

  • Germany offers a health insurance model that, like Canada’s, results in far less spending than in the United States, while achieving universal, comprehensive coverage
  • this model, pioneered by Chancellor Otto von Bismarck in 1883, was the first social health insurance system in the world. It has since been copied across Europe and Asia, becoming far more common than the Canadian single-payer model.
  • Germans are required to have health insurance, but they can choose between more than 100 private nonprofit insurers called “sickness funds.” Workers and employers share the cost of insurance through payroll taxes, while the government finances coverage for children and the unemployed.
  • ...14 more annotations...
  • Insurance plans are not tied to employers. Services are funded through progressive taxation, so access is based on need, not ability to pay, and financial contributions are based on wealth, not health.
  • Contributions to sickness funds are centrally pooled and then allocated to individual insurers using a per-beneficiary formula that factors in differences in health risks.
  • The United States has the foundation for this kind of system. Its Social Security and Medicare systems use taxation to pay for social insurance policies, and the health care exchanges created by the Affordable Care Act provide marketplaces for insurance policies.
  • In Germany, for example, insurers can charge only small out-of-pocket fees limited to 2 percent or less of household income annually
  • Compared with the mostly fee-for-service, single-payer arrangements in Canada or the Medicare system, enrolling Americans in managed care plans paid on a per-patient basis would offer greater incentives to increase efficiency, improve quality of care and promote coordination of care.
  • Under a German-style plan, states could still be given flexibility in regulating nonprofit insurers to reflect regional priorities, similar to the flexibility offered to states in managing Medicaid and the A.C.A. exchanges.
  • Germany, Austria, the Netherlands and other countries with similar systems vastly underspend the United States.
  • Americans may be concerned that lower spending reflects rationing of care, but research has consistently found that not to be the case
  • Administrative and governance costs in multipayer systems are higher than in single-payer systems — 5 percent of health spending in Germany compared with 3 percent in Canada.
  • While recent polls indicate that a majority of Americans support so-called Medicare for all, approval diminishes when the plan is explained or clarified.
  • Americans have long valued choice and competition in their health care. The German model offers both: Patients choose private insurers that compete for enrollees, in the process driving innovation and improving quality.
  • Advocates and policymakers should pick carefully among these paths, choosing one that strikes a balance between what is possible and what is ideal for the United States health system
  • While the single-payer model serves Canada well, transitioning the United States to a multipayer model like Germany’s would require a far smaller leap. And that might encourage Americans to finally make the jump
Javier E

Opinion | Fixing Health Care Starts With the Already Insured - The New York Times - 0 views

  • Health insurance is supposed to provide financial protection against the medical costs of poor health. Yet many insured people still face the risk of enormous medical bills for their “covered” care. A team of researchers estimated that as of mid-2020, collections agencies held $140 billion in unpaid medical bills, reflecting care delivered before the Covid-19 pandemic
  • that’s more than the amount held by collection agencies for all other consumer debt from nonmedical sources combined
  • three-fifths of that debt was incurred by households with health insurance.
  • ...13 more annotations...
  • in any given month, about 11 percent of Americans younger than 65 are uninsured. But more than twice that number — one in four — will be uninsured for at least some time over a two-year period.
  • Perversely, health insurance — the very purpose of which is to provide a measure of stability in an uncertain world — is itself highly uncertain. And while the Affordable Care Act substantially reduced the share of Americans who are uninsured at a given time, we found that it did little to reduce the risk of insurance loss among the currently insured.
  • The experience with the health insurance mandate under the Affordable Care Act makes that clear.
  • The risk of losing coverage is an inevitable consequence of a lack of universal coverage. Whenever there are varied pathways to eligibility, there will be many people who fail to find their path.
  • About six in 10 uninsured Americans are eligible for free or heavily discounted insurance coverage. Yet they remain uninsured. Lack of information about which of the array of programs they are eligible for, along with the difficulties of applying and demonstrating eligibility, mean that the coverage programs are destined to deliver less than they could.
  • incremental reforms won’t work. Over a half-century of such well-intentioned, piecemeal policies has made clear that continuing this approach represents the triumph of hope over experience,
  • The only solution is universal coverage that is automatic, free and basic.
  • Coverage needs to be free at the point of care — no co-pays or deductibles — because leaving patients on the hook for large medical costs is contrary to the purpose of insurance.
  • But it turns out there’s an important practical wrinkle with asking patients to pay even a very small amount for some of their universally covered care: There will always be people who can’t manage even modest co-pays.
  • Finally, coverage must be basic because we are bound by the social contract to provide essential medical care, not a high-end experience.
  • Keeping universal coverage basic will keep the cost to the taxpayer down as well.
  • as a share of its economy, the United States spends about twice as much on health care as other high-income countries. But in most other wealthy countries, this care is primarily financed by taxes, whereas only about half of U.S. health care spending is financed by taxes. For those of you following the math, half of twice as much is … well, the same amount of taxpayer-financed spending on health care as a share of the economy. In other words, U.S. taxes are already paying for the cost of universal basic coverage. Americans are just not getting it. They could be.
  • at a high level, the key elements of our proposal are ones that every high-income country (and all but a few Canadian provinces) has embraced: guaranteed basic coverage and the option for people to purchase upgrades.
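The "half of twice as much" point above can be made concrete with a small sketch. The percentages below are rough placeholders chosen to match the excerpt's ratios ("about twice as much" total spending, "about half" tax-financed), not official statistics:

```python
# Illustrative check of the taxpayer-financing arithmetic in the excerpt.
peer_health_share = 0.09      # assume a peer country spends ~9% of GDP on health
peer_tax_financed = 0.90      # assume ~90% of that is publicly financed

us_health_share = 2 * peer_health_share   # "about twice as much" -> ~18% of GDP
us_tax_financed = 0.50                    # "only about half ... financed by taxes"

peer_public = peer_health_share * peer_tax_financed  # ~8% of GDP
us_public = us_health_share * us_tax_financed        # ~9% of GDP
print(round(peer_public, 3), round(us_public, 3))
```

Under these assumptions, U.S. tax-financed health spending as a share of GDP already matches what peer countries spend publicly on universal coverage, which is the authors' point.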
Javier E

A Conservative Blueprint for Universal Healthcare | MedPage Today - 0 views

  • In 1989, a policy analyst at a leading conservative Washington, D.C. think tank described a workable plan in which private insurers, just as in Germany, provide universal coverage. This plan would: change the current tax treatment of health insurance (which largely benefits people with employer-based coverage at the expense of lower-income Americans); declare that families face the responsibility of having adequate insurance; offer government assistance to families unable to afford health coverage on their own; and reform the Medicare program.
  • The middle planks of this conservative plan ultimately became the Affordable Care Act's (ACA) Marketplaces, where families could purchase health insurance in a new, nationally regulated market with financial subsidies to cover costs for those with incomes below 400% of the federal poverty level (about $92,120 for a family of three).
  • President Barack Obama signed the ACA into law 13 years ago today, transforming a patchwork system of individual health insurance markets into one that today could form a national framework for universal healthcare
  • ...2 more annotations...
  • an "ACA for All" system would prevent government from operating health insurance while allowing it to regulate and finance health insurance for most Americans. The ACA for All would not be "socialized medicine" -- where government not only finances healthcare but supplies it through public hospitals, clinics, and the direct employment of clinicians. ACA for All would continue to rely on private industry (private doctors and private hospitals) and personal responsibility, and would limit the government's role in healthcare delivery.
  • A Republican plan for universal healthcare would offer those with non-employer-based coverage an adequately sized tax deduction, big enough to cover the cost of a family health insurance plan. And, for the first time since the 1940s, individuals would pay taxes on the value of employer-based health insurance above a certain threshold (based on the average cost of a family health insurance plan).
Javier E

What American Healthcare Can Learn From Germany - Olga Khazan - The Atlantic - 0 views

  • Every German resident must belong to a sickness fund, and in turn the funds must insure all comers. They’re also mandated to cover a standard set of benefits, which includes most procedures and medications. Workers pay half the cost of their sickness fund insurance, and employers pay the rest. The German government foots the bill for the unemployed and for children. There are also limits on out-of-pocket expenses, so it’s rare for a German to go into debt because of medical bills.
  • this is very similar to the health-insurance regime that Americans are now living under, now that the Affordable Care Act is four years old and a few days past its first enrollment deadline.
  • There are, of course, a few key differences. Co-pays in the German system are minuscule, about 10 euros per visit. Even those for hospital stays are laughably small by American standards: Sam paid 40 euros for a three-day stay for a minor operation a few years ago
  • ...24 more annotations...
  • nearly five million Americans fall into what’s called the “Medicaid gap”
  • In Germany, employees' premiums are a percentage of their incomes, so low-wage workers simply pay rock-bottom insurance rates.
  • You can think of this setup as the Goldilocks option among all of the possible ways governments can insure health. It's not as radical as single-payer models like the U.K.’s, where the government covers everyone. And it's also not as brutal as the less-regulated version of the insurance market we had before the ACA.
  • Germany actually pioneered this type of insurance—it all started when Otto von Bismarck signed his Health Insurance Bill of 1883 into law. (It’s still known as the “Bismarck model” because of his legacy, and other parts of Europe and Asia have adopted it over the years.)
  • Since there are no provider networks in Germany, doctors don’t know what other providers patients have seen, so there are few ways to limit repeat procedures.
  • All things considered, it’s good to be a sick German. There are no network limitations, so people can see any doctor they want. There are no deductibles, so Germans have no fear of spending hundreds before their insurance ever kicks in.
  • There’s also no money that changes hands during a medical appointment. Patients show their insurance card at the doctor’s office, and the doctors' association pays the doctor using money from the sickness funds. "You don’t have to sit at home and sort through invoices or wonder if you overlooked fine print,”
  • That insurance card, by the way, is good for hospital visits anywhere in Europe.
  • of all of the countries studied, Germans were the most likely to be able to get a same-day or next-day appointment and to hear back from a doctor quickly if they had a question. They rarely use emergency rooms, and they can access doctors after-hours with ease.
  • And Germany manages to put its health-care dollars to relatively good use: For each $100 it spends on healthcare, it extends life by about four months, according to a recent analysis in the American Journal of Public Health. In the U.S., one of the worst-performing nations in the ranking, each $100 spent on healthcare resulted in only a couple of extra weeks of longevity.
  • those differences aside, it’s fair to say the U.S. is moving in the direction of systems like Germany’s—multi-payer, compulsory, employer-based, highly regulated, and fee-for-service.
  • The German government is similarly trying to push more people into “family physician” programs, in which just one doctor would serve as a gatekeeper.
  • like the U.S., Germany may see a shortage of primary-care doctors in the near future, both because primary-care doctors there don’t get paid as much as specialists, and because entrenched norms have prevented physician assistants from shouldering more responsibility
  • With limitations on how much they can charge, German doctors and hospitals instead try to pump up their earnings by performing as many procedures as possible, just like American providers do.
  • With few resource constraints, healthcare systems like America's and Germany's tend to go with the most expensive treatment option possible. An American might find himself in an MRI machine for a headache that a British doctor would have treated with an aspirin and a smile.
  • Similarly, “In Germany, it will always be an operation,” Göpffarth said. “Meanwhile, France and the U.K. tend to try drugs first and operations later.”
  • Perhaps the biggest difference between our two approaches is the extent to which Germany has managed to rein in the cost of healthcare for consumers. Prices for procedures there are lower and more uniform because doctors’ associations negotiate their fees directly with all of the sickness funds in each state. That's part of the reason why an appendectomy costs $3,093 in Germany, but $13,000 in the U.S.
  • “In Germany, there is a uniform fee schedule for all physicians that work under the social code,” Schlette said. “There’s a huge catalogue where they determine meticulously how much is billed for each procedure. That’s like the Bible.”
  • certain U.S. states have tried a more German strategy, attempting to keep costs low by setting prices across the board. Maryland, for example, has been regulating how much all of the state’s hospitals can charge since 1977. A 2009 study published in Health Affairs found that we would have saved $2 trillion if the entire country’s health costs had grown at the same rate as Maryland’s over the past three decades.
  • Now, Maryland is going a step further still, having just launched a plan to cap the amount each hospital can spend, total, each year. The state's hospital spending growth will be limited to 3.58 percent for the next five years. “We know that right now, the more [doctors] do, the more they get paid,” John Colmers, executive director of Maryland’s Health Services Cost Review Commission, told me. “We want to say, ‘The better you do, the better you get paid.’”
  • “The red states are unlikely to follow their lead. The notion that government may be a big part of the solution, instead of the problem, is anathema, and Republican controlled legislatures, and their governors, would find it too substantial a conflict to pursue with any vigor.”
  • no other state has Maryland’s uniform, German-style payment system in place, “so Maryland starts the race nine paces ahead of the other 46 states,” McDonough said.
  • the unique spirit of each country is what ultimately gets in its way. Germany’s more orderly system can be too rigid for experimentation. And America’s free-for-all, where hospitals and doctors all charge different amounts, is great for innovation but too chaotic to make payment reforms stick.
  • rising health costs will continue to be the main problem for Americans as we launch into our more Bismarckian system. “The main challenge you’ll have is price control,” he said. “You have subsidies in health exchanges now, so for the first time, the federal budget is really involved in health expenditure increases in the commercial market. In order to keep your federal budget under control, you’ll have to control prices.”
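The longevity figures quoted above imply a stark cost-per-life-year gap. A back-of-envelope sketch, taking the excerpt's "four months" and "couple of extra weeks" at face value:

```python
# Rough cost-per-life-year comparison implied by the quoted figures:
# each $100 of health spending buys ~4 months of life in Germany
# versus ~2 weeks in the U.S.
spend = 100.0
germany_years_gained = 4 / 12   # ~4 months
us_years_gained = 2 / 52        # ~2 weeks

germany_cost_per_year = spend / germany_years_gained  # ≈ $300
us_cost_per_year = spend / us_years_gained            # ≈ $2,600
print(round(germany_cost_per_year), round(us_cost_per_year))
```

By this crude measure, a year of added longevity costs nearly an order of magnitude more in the U.S. system than in Germany's.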
Javier E

Trump's war on socialism will fail - The Washington Post - 0 views

  • “We socialists are trying to save capitalism, and the damned capitalists won’t let us.” Political scientist Mason B. Williams cited this cheeky but accurate comment by New Deal lawyer Jerome Frank to make a point easily lost in the new war on socialism that President Trump has launched: Socialism goes back a long way in the United States, and it has taken doses of it to keep the market system alive.
  • there would be no social reform, ever, if those seeking change were too timid to go big and allowed cries of “socialism” to intimidate them.
  • Young Americans especially are far more likely to associate “socialism” with generous social insurance states than with jackboots and gulags. Sweden, Norway and Denmark are anything but frightening places.
  • ...2 more annotations...
  • The 2018 PRRI American Values Survey offered respondents two definitions of socialism. One described it as “a system of government that provides citizens with health insurance, retirement support and access to free higher education,” essentially a description of social democracy. The other was the full Soviet dose: “a system where the government controls key parts of the economy, such as utilities, transportation and communications industries.”
  • You might say that socialism is winning the branding war: Fifty-four percent said socialism was about those public benefits, while just 43 percent picked the version that stressed government domination. Americans ages 18 to 29, for whom Cold War memories are dim to nonexistent, were even more inclined to define socialism as social democracy: Fifty-eight percent of them picked the soft option, 38 percent the hard one.
lmunch

The Political Divide In Health Care: A Liberal Perspective | Health Affairs - 0 views

  • Classical seventeenth-century liberalism, a response to autocratic monarchies, promoted the freedom of the individual. The concepts of equality and the rule of law were added to classical liberal doctrine in the eighteenth century, as expressed in the Declaration of Independence and the Bill of Rights. 1 Eighteenth-century liberalism also advocated a universal humanitarian morality: “It is the goal of morality to substitute peaceful behavior for violence, good faith for fraud and overreaching, considerateness for malice, cooperation for the dog-eat-dog attitude.” 2 These precepts, also in the writings of world religions, are best expressed in the Golden Rule, “Do unto others as you would have others do unto you.”
  • John Stuart Mill introduced the utilitarian idea that societies should be responsible to provide the greatest happiness for the greatest number of people. A corollary to this argument was that governments should provide for the overall welfare of the population—a communitarian rather than individualistic strain of liberalism. Liberalism and conservatism went separate ways, with most conservatives advocating that government restrict itself to ensuring individual liberties.
  • “Health care” refers to medical services, but not to a healthy state of being. The right to health care is distinct from the right to health.
  • ...13 more annotations...
  • Rawls deduced that a just society would guarantee personal freedoms as long as they did not impinge on the freedoms of others, would promote equality of opportunity, and would allow inequality only if it would benefit the least advantaged in society.
  • Recently, a neoliberal movement has moved away from New Deal liberalism, partially returning to the classical liberal belief that the free market is the best way to handle societal needs. Neoliberals join conservatives in supporting smaller government and privatization of some New Deal programs.
  • In the health care arena, many liberals feel that governments (although they can be and often are corrupted by power and money) are the only social institutions that can implement the balance between the needs of each individual and those of all individuals—that is, the community.
  • Neoconservatives believe in an aggressive U.S. foreign policy with a strong military, at times placing them at odds with fiscal conservatives. Most conservatives support small government and low taxes and oppose progressive and corporate taxes, believing that economic health is best guaranteed by wealthy individuals and corporations having money to invest in job creation.
  • “Right” means that the government guarantees something to everyone. Rights come in two categories: individual freedoms and population-based entitlements.
  • The nineteenth century also saw the growth of social democracy, a brand of liberalism arguing that the market cannot supply certain human necessities: a minimum income to purchase food, clothes, and housing, and access to health services; governments are needed to guarantee those needs.
  • The liberal belief in health care as a right is based on two varieties of liberal thinking, as noted in the discussion of liberalism above: (1) the social justice argument advanced by Rawls that anyone unaware of his/her position in society would agree with health care as a right because it promotes equality of opportunity and is of the greatest benefit to the least advantaged members of society; and (2) the utilitarian view that guaranteeing health services increases the welfare of the greatest number of people.
  • If health care is just another commodity, it can be supplied by the market; if a necessity, the market is not adequate.
  • One caveat concerns the impact of taxes on public opinion. A 1994 survey found that fewer than half of respondents would pay more taxes to finance universal health insurance.
  • “socialized medicine,” meaning government ownership of health care delivery institutions; social insurance of the single-payer variety is socialized insurance but not socialized medicine.
  • Liberal doctrine argues that social insurance unites the entire population into a single risk pool. The 80 percent of the population that incurs only 20 percent of national health spending pays for the 20 percent who account for 80 percent of spending.
  • The health care system is now financed in a regressive manner. Out-of-pocket payments (about 15 percent of health care spending) consume more than 10 percent of the income of families in the lowest income quintile, compared with about 1 percent for families in the wealthiest 5 percent of the population.
  • Private health insurance is also a regressive method of financing health care because employer-paid insurance premiums are generally considered deductions from wages or salary, and a premium represents a higher proportion of income for lower-paid employees than for those with higher pay. 27 Moreover, the tax deductions for employer coverage benefit the higher-income.
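The single-risk-pool argument in the excerpt (80 percent of people account for 20 percent of spending) can be sketched numerically. The population size and total spending below are hypothetical placeholders; only the 80/20 split comes from the source:

```python
# Minimal sketch of the cross-subsidy inside a single risk pool.
population = 1000
total_spending = 1_000_000  # hypothetical total, in dollars

avg = total_spending / population  # community-rated contribution: $1,000 each
low_cost_expected = (0.2 * total_spending) / (0.8 * population)   # $250 each
high_cost_expected = (0.8 * total_spending) / (0.2 * population)  # $4,000 each

# In one pool everyone pays the average, so each member of the healthy 80%
# cross-subsidizes the sick 20% by avg - low_cost_expected = $750.
print(avg, low_cost_expected, high_cost_expected)
```

The sketch shows why fragmenting the pool is attractive to insurers: covering only the healthy 80% would cost a quarter of the community rate, leaving the expensive 20% behind.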
Javier E

The Dark Power of Fraternities - The Atlantic - 0 views

  • College fraternities—by which term of art I refer to the formerly all-white, now nominally integrated men’s “general” or “social” fraternities, and not the several other types of fraternities on American campuses (religious, ethnic, academic)—are as old, almost, as the republic.
  • While the system has produced its share of poets, aesthetes, and Henry James scholars, it is far more famous for its success in the powerhouse fraternity fields of business, law, and politics. An astonishing number of CEOs of Fortune 500 companies, congressmen and male senators, and American presidents have belonged to fraternities
  • They also have a long, dark history of violence against their own members and visitors to their houses, which makes them in many respects at odds with the core mission of college itself.
  • ...62 more annotations...
  • A recent series of articles on fraternities by Bloomberg News’s David Glovin and John Hechinger notes that since 2005, more than 60 people—the majority of them students—have died in incidents linked to fraternities, a sobering number in itself, but one that is dwarfed by the numbers of serious injuries, assaults, and sexual crimes that regularly take place in these houses.
  • I have spent most of the past year looking deeply into the questions posed by these lawsuits, and more generally into the particular nature of fraternity life on the modern American campus
  • to answer the vexing question “why don’t colleges just get rid of their bad fraternities?”—the system, and its individual frats, have only grown in power and influence. Indeed, in many substantive ways, fraternities are now mightier than the colleges and universities that host them.
  • The entire multibillion-dollar, 2,000-campus American college system
  • the Kappa Alpha Society. Word of the group spread, and a new kind of college institution was founded, and with it a brand-new notion: that going to college could include some pleasure. It was the American age of societies, and this new type fit right in.
  • every moment of the experience is sweetened by the general understanding that with each kegger and rager, each lazy afternoon spent snoozing on the quad (a forgotten highlighter slowly drying out on the open pages of Introduction to Economics, a Coke Zero sweating beside it), they are actively engaged in the most significant act of self-improvement available to an American young person: college!
  • There are many thousands of American undergraduates whose economic futures (and those of their parents) would be far brighter if they knocked off some of their general-education requirements online, or at the local community college—for pennies on the dollar—before entering the Weimar Republic of traditional-college pricing. But college education, like weddings and funerals, tends to prompt irrational financial decision making,
  • depends overwhelmingly for its very existence on one resource: an ever-renewing supply of fee-paying undergraduates. It could never attract hundreds of thousands of them each year—many of them woefully unprepared for the experience, a staggering number (some 40 percent) destined never to get a degree, more than 60 percent of them saddled with student loans that they very well may carry with them to their deathbeds—if the experience were not accurately marketed as a blast.
  • When colleges tried to shut them down, fraternities asserted that any threat to men’s membership in the clubs constituted an infringement of their right to freedom of association. It was, at best, a legally delicate argument, but it was a symbolically potent one, and it has withstood through the years. The powerful and well-funded political-action committee that represents fraternities in Washington has fought successfully to ensure that freedom-of-association language is included in all higher-education reauthorization legislation, thus “disallowing public Universities the ability to ban fraternities.”
  • While the fraternities continued to exert their independence from the colleges with which they were affiliated, these same colleges started to develop an increasingly bedeviling kind of interdependence with the accursed societies
  • the fraternities involved themselves very deeply in the business of student housing, which provided tremendous financial savings to their host institutions, and allowed them to expand the number of students they could admit. Today, one in eight American students at four-year colleges lives in a Greek house
  • fraternities tie alumni to their colleges in a powerful and lucrative way. At least one study has affirmed what had long been assumed: that fraternity men tend to be generous to their alma maters. Furthermore, fraternities provide colleges with unlimited social programming of a kind that is highly attractive to legions of potential students
  • It is true that fraternity lawsuits tend to involve at least one, and often more, of the four horsemen of the student-life apocalypse, a set of factors that exist far beyond frat row
  • the binge-drinking epidemic, which anyone outside the problem has a hard time grasping as serious (everyone drinks in college!) and which anyone with knowledge of the current situation understands as a lurid and complicated disaster
  • The second is the issue of sexual assault of female undergraduates by their male peers, a subject of urgent importance but one that remains stubbornly difficult even to quantify
  • The third is the growing pervasiveness of violent hazing on campus
  • But it’s impossible to examine particular types of campus calamity and not find that a large number of them cluster at fraternity houses
  • the fourth is the fact that Boomers, who in their own days destroyed the doctrine of in loco parentis so that they could party in blissful, unsupervised freedom, have grown up into the helicopter parents of today
  • during the period of time under consideration, serious falls from fraternity houses on the two Palouse campuses far outnumbered those from other types of student residences, including privately owned apartments occupied by students. I began to view Amanda Andaverde’s situation in a new light.
  • Why are so many colleges allowing students to live and party in such unsafe locations? And why do the lawsuits against fraternities for this kind of serious injury and death—so predictable and so preventable—have such a hard time getting traction? The answers lie in the recent history of fraternities and the colleges and universities that host them.
  • This question is perhaps most elegantly expressed in the subtitle of Robert D. Bickel and Peter F. Lake’s authoritative 1999 book on the subject, The Rights and Responsibilities of the Modern University: Who Assumes the Risks of College Life?
  • The answer to this question has been steadily evolving ever since the 1960s, when dramatic changes took place on American campuses, changes that affected both a university’s ability to control student behavior and the status of fraternities in the undergraduate firmament. During this period of student unrest, the fraternities—long the unquestioned leaders in the area of sabotaging or ignoring the patriarchal control of school administrators—became the exact opposite: representatives of the very status quo the new activists sought to overthrow. Suddenly their beer bashes and sorority mixers, their panty raids and obsession with the big game, seemed impossibly reactionary when compared with the mind-altering drugs being sampled in off-campus apartments where sexual liberation was being born and the Little Red Book proved, if nothing else, a fantastic coaster for a leaky bong.
  • American colleges began to regard their students not as dependents whose private lives they must shape and monitor, but as adult consumers whose contract was solely for an education, not an upbringing. The doctrine of in loco parentis was abolished at school after school.
  • Through it all, fraternities—for so long the repositories of the most outrageous behavior—moldered, all but forgotten.
  • Animal House, released in 1978, at once predicted and to no small extent occasioned the roaring return of fraternity life that began in the early ’80s and that gave birth to today’s vital Greek scene
  • In this newly forming culture, the drugs and personal liberation of the ’60s would be paired with the self-serving materialism of the ’80s, all of which made partying for its own sake—and not as a philosophical adjunct to solving some complicated problem in Southeast Asia—a righteous activity for the pampered young collegian. Fraternity life was reborn with a vengeance.
  • These new members and their countless guests brought with them hard drugs, new and ever-developing sexual attitudes, and a stunningly high tolerance for squalor
  • Adult supervision was nowhere to be found. Colleges had little authority to intervene in what took place in the personal lives of their students visiting private property. Fraternities, eager to provide their members with the independence that is at the heart of the system—and responsive to members’ wish for the same level of freedom that non-Greek students enjoyed—had largely gotten rid of the live-in resident advisers who had once provided some sort of check on the brothers
  • , in 1984 Congress passed the National Minimum Drinking Age Act, with the ultimate result of raising the legal drinking age to 21 in all 50 states. This change moved college partying away from bars and college-sponsored events and toward private houses—an ideal situation for fraternities
  • lawsuits began to pour in.
  • Liability insurance became both ruinously expensive and increasingly difficult to obtain. The insurance industry ranked American fraternities as the sixth-worst insurance risk in the country—just ahead of toxic-waste-removal companies.
  • For fraternities to survive, they needed to do four separate but related things: take the task of acquiring insurance out of the hands of the local chapters and place it in the hands of the vast national organizations; develop procedures and policies that would transfer as much of their liability as possible to outside parties; find new and creative means of protecting their massive assets from juries; and—perhaps most important of all—find a way of indemnifying the national and local organizations from the dangerous and illegal behavior of some of their undergraduate members.
  • comprising a set of realities you should absolutely understand in detail if your son ever decides to join a fraternity.
  • you may think you belong to Tau Kappa Epsilon or Sigma Nu or Delta Tau Delta—but if you find yourself a part of life-changing litigation involving one of those outfits, what you really belong to is FIPG, because its risk-management policy (and your adherence to or violation of it) will determine your fate far more than the vows you made during your initiation ritual
  • the need to manage or transfer risk presented by alcohol is perhaps the most important factor in protecting the system’s longevity. Any plaintiff’s attorney worth his salt knows how to use relevant social-host and dramshop laws against a fraternity; to avoid this kind of liability, the fraternity needs to establish that the young men being charged were not acting within the scope of their status as fraternity members. Once they violated their frat’s alcohol policy, they parted company with the frat.
  • there are actually only two FIPG-approved means of serving drinks at a frat party. The first is to hire a third-party vendor who will sell drinks and to whom some liability—most significant, that of checking whether drinkers are of legal age—will be transferred. The second and far more common is to have a BYO event, in which the liability for each bottle of alcohol resides solely in the person who brought it.
  • these policies make it possible for fraternities to be the one industry in the country in which every aspect of serving alcohol can be monitored and managed by people who are legally too young to drink it.
  • But when the inevitable catastrophes do happen, that policy can come to seem more like a cynical hoax than a real-world solution to a serious problem.
  • Thanks in part to the guest/witness list, Larry can be cut loose, both from the expensive insurance he was required to help pay for (by dint of his dues) as a precondition of membership, and from any legal defense paid for by the organization. What will happen to Larry now?
  • “I’ve recovered millions and millions of dollars from homeowners’ policies,” a top fraternal plaintiff’s attorney told me. For that is how many of the claims against boys who violate the strict policies are paid: from their parents’ homeowners’ insurance
  • , the Fraternal Information and Programming Group’s chillingly comprehensive crisis-management plan was included in its manual for many years
  • the plan serves a dual purpose, at once benevolent and mercenary. The benevolent part is accomplished by the clear directive that injured parties are to receive immediate medical attention, and that all fraternity brothers who come into contact with the relevant emergency workers are to be completely forthright
  • “Until proven otherwise,” Fierberg told me in April of fraternities, “they all are very risky organizations for young people to be involved in.” He maintains that fraternities “are part of an industry that has tremendous risk and a tremendous history of rape, serious injury, and death, and the vast majority share common risk-management policies that are fundamentally flawed. Most of them are awash in alcohol. And most if not all of them are bereft of any meaningful adult supervision.”
  • the interests of the national organization and the individual members cleave sharply as this crisis-management plan is followed. Those questionnaires and honest accounts—submitted gratefully to the grown-ups who have arrived, the brothers believe, to help them—may return to haunt many of the brothers, providing possible cause for separating them from the fraternity, dropping them from the fraternity’s insurance, laying the blame on them as individuals and not on the fraternity as the sponsoring organization.
  • So here is the essential question: In the matter of these disasters, are fraternities acting in an ethical manner, requiring good behavior from their members and punishing them soundly for bad or even horrific decisions? Or are they keeping a cool distance from the mayhem, knowing full well that misbehavior occurs with regularity (“most events take place at night”) and doing nothing about it until the inevitable tragedy occurs, at which point they cajole members into incriminating themselves via a crisis-management plan presented as being in their favor?
  • I have had long and wide-ranging conversations with both men, in which each put forth his perspective on the situation.
  • the young men who typically rush so gratefully into the open arms of the representatives from their beloved national—an outfit to which they have pledged eternal allegiance—would be far better served by not talking to them at all, by walking away from the chapter house as quickly as possible and calling a lawyer.
  • The fraternity system, he argues, is “the largest industry in this country directly involved in the provision of alcohol to underage people.” The crisis-management plans reveal that in “the foreseeable future” there may be “the death or serious injury” of a healthy young person at a fraternity function.
  • His belief is that what’s tarnishing the reputation of the fraternities is the bad behavior of a very few members, who ignore all the risk-management training that is requisite for membership, who flout policies that could not be any more clear, and who are shocked when the response from the home office is not to help them cover their asses but to ensure that—perhaps for the first time in their lives—they are held 100 percent accountable for their actions.
  • Unspoken but inherent in this larger philosophy is the idea that it is in a young man’s nature to court danger and to behave in a foolhardy manner; the fraternity experience is intended to help tame the baser passions, to channel protean energies into productive endeavors such as service, sport, and career preparation.
  • In a sense, Fierberg, Smithhisler, and the powerful forces they each represent operate as a check and balance on the system. Personal-injury lawsuits bring the hated media attention and potential financial losses that motivate fraternities to improve. It would be a neat, almost a perfect, system, if the people wandering into it were not young, healthy college students with everything to lose.
  • Wesleyan is one of those places that has by now become so hard to get into that the mere fact of attendance is testament, in most cases, to a level of high-school preparation—combined with sheer academic ability—that exists among students at only a handful of top colleges in this country and that is almost without historical precedent.
  • This January, after publishing a withering series of reports on fraternity malfeasance, the editors of Bloomberg.com published an editorial with a surprising headline: “Abolish Fraternities.” It compared colleges and universities to companies, and fraternities to units that “don’t fit into their business model, fail to yield an adequate return or cause reputational harm.”
  • A college or university can choose, as Wesleyan did, to end its formal relationship with a troublesome fraternity, but—if that fiasco proves anything—keeping a fraternity at arm’s length can be more devastating to a university and its students than keeping it in the fold.
  • there is a Grand Canyon–size chasm between the official risk-management policies of the fraternities and the way life is actually lived in countless dangerous chapters.
  • When there is a common denominator among hundreds of such injuries and deaths, one that exists across all kinds of campuses, from private to public, prestigious to obscure, then it is more than newsworthy: it begins to approach a national scandal.
Javier E

Opinion | Vaccine Hesitancy Is About Trust and Class - The New York Times - 0 views

  • The world needs to address the root causes of vaccine hesitancy. We can’t go on believing that the issue can be solved simply by flooding skeptical communities with public service announcements or hectoring people to “believe in science.”
  • For the past five years, we’ve conducted surveys and focus groups abroad and interviewed residents of the Bronx to better understand vaccine avoidance.
  • We’ve found that people who reject vaccines are not necessarily less scientifically literate or less well-informed than those who don’t. Instead, hesitancy reflects a transformation of our core beliefs about what we owe one another.
  • ...43 more annotations...
  • Over the past four decades, governments have slashed budgets and privatized basic services. This has two important consequences for public health
  • First, people are unlikely to trust institutions that do little for them.
  • Second, public health is no longer viewed as a collective endeavor, based on the principle of social solidarity and mutual obligation. People are conditioned to believe they’re on their own and responsible only for themselves.
  • an important source of vaccine hesitancy is the erosion of the idea of a common good.
  • compared with white Americans, communities of color do experience the American health care system differently. But a closer look at the data reveals a more complicated picture.
  • Since the spring, when most American adults became eligible for Covid vaccines, the racial gap in vaccination rates between Black and white people has been halved. In September, a national survey found that vaccination rates among Black and white Americans were almost identical.
  • Other surveys have determined that a much more significant factor was college attendance: Those without a college degree were the most likely to go unvaccinated.
  • Education is a reliable predictor of socioeconomic status, and other studies have similarly found a link between income and vaccination.
  • It turns out that the real vaccination divide is class.
  • “People are thinking, ‘If the government isn’t going to do anything for us,’” said Elden, “‘then why should we participate in vaccines?’”
  • during the 1950s polio campaigns, for example, most people saw vaccination as a civic duty.
  • But as the public purse shrunk in the 1980s, politicians insisted that it’s no longer the government’s job to ensure people’s well-being; instead, Americans were to be responsible only for themselves and their own bodies
  • Entire industries, such as self-help and health foods, have sprung up on the principle that the key to good health lies in individuals making the right choices.
  • Without an idea of the common good, health is often discussed using the language of “choice.”
  • there are problems with reducing public health to a matter of choice. It gives the impression that individuals are wholly responsible for their own health.
  • This is despite growing evidence that health is deeply influenced by factors outside our control; public health experts now talk about the “social determinants of health,” the idea that personal health is never simply just a reflection of individual lifestyle choices, but also the class people are born into, the neighborhood they grew up in and the race they belong to.
  • food deserts and squalor are not easy problems to solve — certainly not by individuals or charities — and they require substantial government action.
  • Many medical schools teach “motivational interviewing,”
  • the deeper problem:
  • Being healthy is not cheap. Studies indicate that energy-dense foods with less nutritious value are more affordable, and low-cost diets are linked to obesity and insulin resistance.
  • Another problem with reducing well-being to personal choice is that this treats health as a commodity.
  • This isn’t surprising, since we shop for doctors and insurance plans the way we do all other goods and services
  • mothers devoted many hours to “researching” vaccines, soaking up parental advice books and quizzing doctors. In other words, they act like savvy consumers
  • When thinking as a consumer, people tend to downplay social obligations in favor of a narrow pursuit of self-interest. As one parent told Reich, “I’m not going to put my child at risk to save another child.”
  • Such risk-benefit assessments for vaccines are an essential part of parents’ consumer research.
  • Vaccine uptake is so high among wealthy people because Covid is one of the gravest threats they face. In some wealthy Manhattan neighborhoods, for example, vaccination rates run north of 90 percent.
  • For poorer and working-class people, though, the calculus is different: Covid-19 is only one of multiple grave threats.
  • When viewed in the context of the other threats they face, Covid no longer seems uniquely scary.
  • Most of the people we interviewed in the Bronx say they are skeptical of the institutions that claim to serve the poor but in fact have abandoned them.
  • he and his friends find reason to view the government’s sudden interest in their well-being with suspicion. “They are over here shoving money at us,” a woman told us, referring to a New York City offer to pay a $500 bonus to municipal workers to get vaccinated. “And I’m asking, why are you so eager, when you don’t give us money for anything else?”
  • These views reinforce the work of social scientists who find a link between a lack of trust and inequality. And without trust, there is no mutual obligation, no sense of a common good.
  • The experience of the 1960s suggests that when people feel supported through social programs, they’re more likely to trust institutions and believe they have a stake in society’s health.
  • While the reasons vary by country, the underlying causes are the same: a deep mistrust in local and international institutions, in a context in which governments worldwide have cut social services.
  • In one Syrian city, for example, the health care system now consists of one public hospital so underfunded that it is notorious for poor care, a few private hospitals offering high-quality care that are unaffordable to most of the population, and many unlicensed and unregulated private clinics — some even without medical doctors — known to offer misguided health advice. Under such conditions, conspiracy theories can flourish; many of the city’s residents believe Covid vaccines are a foreign plot.
  • In many developing nations, international aid organizations are stepping in to offer vaccines. These institutions are sometimes more equitable than governments, but they are often oriented to donor priorities, not community needs.
  • “We have starvation and women die in childbirth,” one tribal elder told us. “Why do they care so much about polio? What do they really want?”
  • In America, anti-vaccine movements are as old as vaccines themselves; efforts to immunize people against smallpox prompted bitter opposition at the turn of the last century. But after World War II, these attitudes disappeared. In the 1950s, demand for the polio vaccine often outstripped supply, and by the late 1970s, nearly every state had laws mandating vaccinations for school with hardly any public opposition.
  • What changed? This was the era of large, ambitious government programs like Medicare and Medicaid.
  • The anti-measles policy, for example, was an outgrowth of President Lyndon Johnson’s Great Society and War on Poverty initiatives.
  • Research shows that private systems not only tend to produce worse health outcomes than public ones, but privatization creates what public health experts call “segregated care,” which can undermine the feelings of social solidarity that are critical for successful vaccination drives
  • Only then do the ideas of social solidarity and mutual obligation begin to make sense.
  • The types of social programs that best promote this way of thinking are universal ones, like Social Security and universal health care.
  • If the world is going to beat the pandemic, countries need policies that promote a basic, but increasingly forgotten, idea: that our individual flourishing is bound up in collective well-being.
anonymous

Opinion | The Coronavirus Has Laid Bare the Inequality of America's Health Care - The N... - 0 views

  • The notion of price control is anathema to health care companies. It threatens their basic business model, in which the government grants them approvals and patents, pays whatever they ask, and works hand in hand with them as they deliver the worst health outcomes at the highest costs in the rich world.
  • The American health care industry is not good at promoting health, but it excels at taking money from all of us for its benefit. It is an engine of inequality.
  • the virus also provides an opportunity for systemic change. The United States spends more than any other nation on health care, and yet we have the lowest life expectancy among rich countries. And although perhaps no system can prepare for such an event, we were no better prepared for the pandemic than countries that spend far less.
  • ...25 more annotations...
  • One way or another, everyone pays for health care. It accounts for about 18 percent of G.D.P. — nearly $11,000 per person. Individuals directly pay about a quarter, the federal and state governments pay nearly half, and most of the rest is paid by employers.
  • Many Americans think their health insurance is a gift from their employers — a “benefit” bestowed on lucky workers by benevolent corporations. It would be more accurate to think of employer-provided health insurance as a tax.
  • Rising health care costs account for much of the half-century decline in the earnings of men without a college degree, and contribute to the decline in the number of less-skilled jobs.
  • Employer-based health insurance is a wrecking ball, destroying the labor market for less-educated workers and contributing to the rise in “deaths of despair.”
  • We face a looming trillion-dollar federal deficit caused almost entirely by the rising costs of Medicaid and Medicare, even without the recent coronavirus relief bill.
  • Rising costs are an untenable burden on our government, too. States’ payments for Medicaid have risen from 20.5 percent of their spending in 2008 to 28.9 percent in 2019. To meet those rising costs, states have cut their financing for roads, bridges and state universities. Without those crucial investments, the path to success for many Americans is cut off
  • Every year, the United States spends $1 trillion more than is needed for high quality care.
  • executives at hospitals, medical device makers and pharmaceutical companies, and some physicians, are very well paid.
  • American doctors control access to their profession through a system that limits medical school admissions and the entry of doctors trained abroad — an imbalance that was clear even before the pandemic
  • Hospitals, many of them classified as nonprofits, have consolidated, with monopolies over health care in many cities, and they have used that monopoly power to raise prices
  • These are all strategies that lawmakers and regulators could put a stop to, if they choose.
  • The health care industry has armored itself, employing five lobbyists for each elected member of Congress. But public anger has been building — over drug prices, co-payments, surprise medical bills — and now, over the fragility of our health care system, which has been laid bare by the pandemic
  • A single-payer system is just one possibility. There are many systems in wealthy countries to choose from, with and without insurance companies, with and without government-run hospitals. But all have two key characteristics: universal coverage — ideally from birth — and cost control.
  • In the United States, public funding is likely to play a significant role in any treatments or vaccines that are eventually developed for Covid-19. Americans should demand that they be available at a reasonable price to everyone — not in the sole interest of drug companies.
  • We are believers in free-market capitalism, but health care is not something it can deliver in a socially tolerable way.
  • They choose not to. And so we Americans have too few doctors, too few beds and too few ventilators — but lots of income for providers
  • America is a rich country that can afford a world-class health care system. We should be spending a lot of money on care and on new drugs. But we need to spend to save lives and reduce sickness, not on expensive, income-generating procedures that do little to improve health. Or worst of all, on enriching pharma companies that feed the opioid epidemic.
  • Medical device manufacturers have also consolidated, in some cases using a “catch and kill” strategy to swallow up nimbler start-ups and keep the prices of their products high.
  • Ambulance services and emergency departments that don’t accept insurance have become favorites of private equity investors because of their high profits
  • Britain, for example, has the National Institute for Health and Care Excellence, which vets drugs, devices and procedures for their benefit relative to cost
  • At the very least, America must stop financing health care through employer-based insurance, which encourages some people to work but eliminates jobs for less-skilled workers
  • Our system takes from the poor and working class to generate wealth for the already wealthy.
  • passed a coronavirus bill including $3.1 billion to develop and produce drugs and vaccines.
  • The industry might emerge as a superhero of the war against Covid-19, like the Royal Air Force in the Battle of Britain during World War II.
  • Millions have lost their paychecks and their insurance.
Javier E

The Court Affirms Our Social Contract - The Atlantic - 0 views

  • the federal courts are the guardians of our Constitution. That is certainly true, but it is not the whole story. In fact, the most important function of the federal courts is to legitimate state building by the political branches.
  • What is “state building”? Throughout our country's history, government has taken on many new functions. The early 19th century American state actually didn't do very much more than national defense and customs collection. The executive branch was tiny. Over the years, the federal government took on more and more obligations, offering new protections and new services for its citizens. After the Civil War, Congress passed a series of civil rights laws, it created the Interstate Commerce Commission to regulate railroads, it passed an income tax, and early in the twentieth century it created a central bank. State building really took off after the New Deal, which established the modern administrative and regulatory state and added a host of labor and consumer protection regulations, investments in infrastructure, and Social Security. The National Security State was born after World War II, and the 1960s brought new civil rights laws and new social welfare programs through the Great Society. At the turn of the 21st century, the federal government expanded its national security infrastructure even further, implementing vast new surveillance programs and strategies for dealing with terrorism
  • Whenever the federal government expands its capabilities, it changes the nature of the social compact. Sometimes the changes are small, but sometimes, as in the New Deal or the civil rights era, the changes are big. And when the changes are big, courts are called on to legitimate the changes and ensure that they are consistent with our ancient Constitution.
  • ...5 more annotations...
  • The words "legitimate" and "ratify," however, are ambiguous terms. Courts do not simply rubber stamp what the political branches do. Rather, they set new ground rules. The government may do this as long as it doesn't do that. Legitimation is Janus-faced: it establishes what government can do by establishing what the government cannot do.
  • The real constitutional struggle begins in 1968, when Richard Nixon appointed four new conservative justices to the Court in his first term. These new justices accepted and ratified the changes of the 1960s, but also limited them in important ways. They made clear that the welfare state was constitutionally permissible but not constitutionally required, held that education was not a fundamental right, limited the use of busing to achieve racial integration, and halted the Warren Court's revolution in criminal procedure. The changes in social contract were ratified, but on more conservative terms.
  • Roberts held that the individual mandate could not be justified by Congress's power to regulate interstate commerce. If it was constitutional, it was only as a tax, which gave people a choice to purchase health insurance or pay a small penalty. As I have argued for many years, this is, in fact, the correct interpretation of what the mandate does. Once this point is accepted, the argument for the mandate's constitutionality is straightforward, and Roberts quickly showed why this was true.
  • Roberts' reasoning captures the dual nature of judicial legitimation. He has said to Congress: "You may compel people to enter into commercial transactions like the insurance mandate, but you may not do so as a direct order under the commerce power. Instead, you must do it through the taxing power, always giving people the choice to pay a tax instead. And as long as you structure the mandate as a tax, the people's rights are protected because they always have the right to throw their elected representatives out of office if they don't like the tax." Roberts' opinion thus harks back to a basic source of legitimacy enshrined in the American Revolution: "No taxation without representation."
  • the Medicaid extension. He argued that Congress may create new social programs that expand protection for the poor. But Congress may not tell states that they must accept the new programs or else lose all federal contributions to existing social programs of long standing. The federal government may, if it wants, totally fund the Medicaid extension out of its own pocket without any help from the states. It may abolish the old version of Medicaid and create a new version in its place identical to the expanded version. What it may not do, Roberts argued, is to leverage States' dependence on federal money in established social welfare programs to compel States to participate in new social welfare programs.
Javier E

The Young Left Is a Third Party - The Atlantic - 0 views

  • Americans 55 and up account for less than one-third of the population, but they own two-thirds of the nation’s wealth, according to the Federal Reserve. That’s the highest level of elderly wealth concentration on record. The reason is simple: To an unprecedented degree, older Americans own the most valuable real estate and investment portfolios. They’ve captured more than 80 percent of stock-market growth since the end of the Great Recession.
  • under the age of 40, for their part, are historically well educated, historically peaceful, and historically law-abiding
  • “In the U.S, as in the U.K. and in much of Europe, 2008 was the end of the end of history,” says Keir Milburn, the author of Generation Left, a book on young left-wing movements. “The last decade in the U.K. has been the worst decade for wage growth for 220 years. In the U.S., this generation is the first in a century that expects to have lower lifetime earnings than their parents. It has created an epochal shift.”
  • ...21 more annotations...
  • Young Americans demanding more power, control, and justice have veered sharply to the left. This lurch was first evident in the two elections of Barack Obama, when he won the youth vote by huge margins
  • Obama won about 60 percent of voters younger than 30 in the 2008 primary. Bernie Sanders won more than 70 percent of under-30 voters in the 2016 primary, which pushed Hillary Clinton to the left and dragged issues like Medicare for All and free college from the fringe to the mainstream of political debate.
  • Joe Biden polled at 2 percent among voters under 30, within the margin of error of zero. Nationally, he is in single digits among Millennials, the generation born between 1981 and 1996. Yet Biden is the Democratic front-runner for the 2020 presidential nomination, thanks to his huge advantage among old voters and black voters, who are considerably more moderate than younger Democrats.
  • Bernie Sanders, by contrast, leads all candidates among voters under 30 and polls just 5 percent among voters over 65
  • age divides young leftists from both Republicans and Democrats. Democrats under 30 have almost no measurable interest in the party’s front-runner. Democrats over 65 have almost no measurable interest in the favored candidate of the younger generation.
  • age—perhaps even more than class or race—is now the most important fault line within the Democratic Party.
  • It might be most useful to think about young progressives as a third party trapped in a two-party system.
  • they are a powerful movement politically domiciled within a larger coalition of moderate older minorities and educated suburbanites, who don’t always know what to do with their rambunctious bunkmates.
  • this progressive third party’s platform look like?
  • justice: Social justice, sought through a reappraisal of power relationships in social and corporate life, and economic justice, sought through the redistribution of income from the rich to the less fortunate.
  • This group’s support for Medicare for All, free college, and student-debt relief is sometimes likened to a “give me free stuff” movement.
  • every movement wants free stuff, if by free stuff one means “stuff given preferential treatment in the tax code.” By this definition, Medicare is free stuff, and investment income is free stuff, and suburban home values propped up by the mortgage-interest deduction are free stuff. The free stuff in the tax code today benefits Americans with income and wealth—a population that is disproportionately old.
  • Medicare for All might be politically infeasible, but it is, taken literally, a request that the federal government extend to the entire population the insurance benefits now exclusively reserved for the elderly. That’s not hatred or resentment; it sounds more like justice.
  • across ethnicities, many Americans have a deep aversion to anything that can be characterized as “political correctness” or “socialism.”
  • this might be the biggest challenge for the young progressive agenda
  • While Medicare for All often polls well, its public support is exquisitely sensitive to framing. According to the Kaiser Family Foundation, the net favorability of eliminating private insurance or requiring most Americans to pay more in taxes—both part of the Sanders plan—is negative-23 points.
  • The young left’s deep skepticism toward capitalism simply isn’t shared by previous generations.
  • Gen X is firmly pro-capitalist and Baby Boomers, who came of age during the Cold War, prefer capitalism over socialism by a two-to-one margin.
  • Social Security and Medicare are, essentially, socialism for the old, but that’s not the same as converting them into Berniecrats.)
  • “This is only the halfway point of an epochal change in Western politics following the Great Recession,” Keir Milburn says. The far right has responded with calls for xenophobic nationalism to preserve national identity, while the left has responded with calls for social democracy to restore socioeconomic justice
  • the far right is ascendant, but they have no answer to the future because they’ve given up on the future. The young left has identified that the future of adulthood no longer feels viable to many people, and it’s putting together a different vision.”
blairca

Money Is the Oxygen on Which the Fire of Global Warming Burns | The New Yorker - 0 views

  • This spring, we set another high mark for carbon dioxide in the atmosphere: four hundred and fifteen parts per million, higher than it has been in many millions of years.
  • Last fall, the world’s climate scientists said that, if we are to meet the goals we set in the 2015 Paris climate accord—which would still raise the mercury fifty per cent higher than it has already climbed—we’ll essentially need to cut our use of fossil fuels in half by 2030 and eliminate them altogether by mid-century.
  • But we’re moving far too slowly to exploit the opening for rapid change that this feat of engineering offers. Hence the 2 A.M. dread.
  • ...15 more annotations...
  • And the trend is remarkable: in the three years since the signing of the Paris climate accord, which was designed to help the world shift away from fossil fuels, the banks’ lending to the industry has increased every year, and much of the money goes toward the most extreme forms of energy development.
  • Political change usually involves slow compromise, and that’s in a working system, not a dysfunctional gridlock such as the one we now have in Washington.
  • I suspect that the key to disrupting the flow of carbon into the atmosphere may lie in disrupting the flow of money to coal and oil and gas.
  • And, if the world were to switch decisively to solar and wind power, Chase would lend to renewable-energy companies, too. Indeed, it already does, though on a much smaller scale.
  • The same is true of the asset-management and insurance industries: without them, the fossil-fuel companies would almost literally run out of gas, but BlackRock and Chubb could survive without their business.
  • The terminal will spit out the current league tables, which rank loan volume: showing, for example, which banks are lending the most money to railroad builders or to copper miners—or to fossil-fuel companies.
  • we need to do more, for the simple reason that they may not pay off fast enough. Climate change is a timed test, one of the first that our civilization has faced, and with each scientific report the window narrows.
  • The biggest oil companies might still be able to self-finance their continuing operations, but “the pure-play frackers will find finance impossible,” Buckley said. “Coal-dependent rail carriers and port owners and coal-mine contracting firms will all be hit.”
  • “the impacts of that social signal would be significant immediately, while the economic impacts from transitioning off of fossil fuels would happen over time.”
  • But four-fifths of the world’s population lives in nations that currently pay to import fossil fuels, and their economies would benefit, as ample financing would allow them to transition relatively quickly to low-cost solar and wind power.
  • In some ways, the insurance industry resembles the banks and the asset managers: it controls a huge pool of money and routinely invests enormous sums in the fossil-fuel industry.
  • Insurance companies are the part of our economy that we ask to understand risk, the ones with the data to really see what is happening as the climate changes, and for decades they’ve been churning out high-quality research establishing just how bad the crisis really is.
  • The second thing that makes insurance companies unique is that they don’t just provide money; they provide insurance. If you want to build a tar-sands pipeline or a coal-fired power plant or a liquefied-natural-gas export terminal, you need to get an insurance company to underwrite the plan.
  • But it’s both simple and powerful to switch your bank account: local credit unions and small-town banks are unlikely to be invested in fossil fuels,
  • Financial institutions can help with that work, but their main usefulness lies in helping to break the power of the fossil-fuel companies.
Javier E

Why a Belgian insurer studies the impact of air pollution on health | Air pollution | T... - 0 views

  • Why would a health insurance company study air pollution? Because, as Christian Horemans, an environment and health expert with the Belgian mutual insurer Mutualités Libres, explains: “It has an enormous impact on public health and an enormous cost for the Belgian compulsory health and disability insurance
  • The insurer’s recent study, carried out by a team that included Belgian and Dutch universities and research centres, compared health insurance claims from 1.2 million people in Belgium in 2019 with particle pollution in their neighbourhoods. The financial results were startling.
  • the study found associations between GP and hospital emergency visits and particle pollution (PM2.5), and when it applied the results to Belgium as a whole, researchers estimated that reducing air pollution to match the least polluted 25% of places in Belgium would have saved €43m (£37m) in GP and emergency hospital visits in 2019.
  • ...5 more annotations...
  • “The lower the concentrations of PM2.5, the fewer people need to see the general practitioner. So improving air quality not only benefits public health, but also ensures the financial sustainability of the social security system.”
  • The study also found that more tree and grass cover in a neighbourhood was associated with less demand for GP and hospital visits. This was especially true of cities and accords with wider evidence on green space in urban areas.
  • The World Health Organization already recommends that we have access to 0.5 hectares (1.2 acres) of public green space, about two-thirds the size of a football pitch, within 300 metres of our homes.
  • The urban forester Cecil Konijnendijk recently suggested a 3-30-300 rule of thumb for urban forestry and urban greening – every citizen should be able to see at least three trees (of a decent size) from their home, have 30% tree-canopy cover in their neighbourhood and not live more than 300 metres from a park or green space.
  • “Based on these results, we need more, not fewer initiatives, like Barcelona’s Superblocks. Political pressure to reverse urban planning initiatives that reduce air pollution and increase tree cover seem ridiculous.”
Javier E

How the Coronavirus Will Change Young People's Lives - The Atlantic - 0 views

  • Generation C includes more than just babies. Kids, college students, and those in their first post-graduation jobs are also uniquely vulnerable to short-term catastrophe. Recent history tells us that the people in this group could see their careers derailed, finances shattered, and social lives upended.
  • With many local businesses closed or viewed as potential vectors of disease, pandemic conditions have already funneled more money to Amazon and its large-scale competitors, including Walmart and Costco.
  • “Epidemics are really bad for economies,”
  • ...35 more annotations...
  • “We’re going to see a whole bunch of college graduates and people finishing graduate programs this summer who are going to really struggle to find work.”
  • People just starting out now, and those who will begin their adult lives in the years following the pandemic, will be asked to walk a financial tightrope with no practice and, for most, no safety net. Fewer of them will be able to turn to their parents or other family members for significant help
  • To gauge what’s in store for job-seekers, it might be most useful to look to a different, more recent kind of disaster: the 2008 financial collapse. More than a decade later, its effects are widely understood to have been catastrophic to the financial futures of those who were in their teens and 20s when it hit.
  • Not only did jobs dry up, but federal relief dollars mostly went to large employers such as banks and insurance companies instead of to workers themselves.
  • investors picked off dirt-cheap foreclosures to flip them for wealthier buyers or turn them into rentals, which has helped rising housing prices far outpace American wage growth.
  • Millennials, many of whom spent years twisting in the wind when, under better circumstances, they would have been setting down the professional and social foundations for stable lives, now have less money in savings than previous generations did at the same age. Relatively few of them have bought homes, married, or had children.
  • Just as the nation’s housing stock moved into the hands of fewer people during the Great Recession, small and medium-size businesses might suffer a similar fate after the pandemic, which could be a nightmare for the country’s labor force.
  • Schoolwork, it turns out, is hard to focus on during a slow-rolling global disaster.
  • American restaurants, which employ millions, have been devastated by quarantine restrictions, but national chains such as Papa John’s and Little Caesars are running television ads touting the virus-murdering temperatures of their commercial ovens,
  • The private-equity behemoth Bain Capital is making plans to gobble up desirable companies weakened by the pandemic. The effect could be a quick consolidation of capital, and the fewer companies that control the economy, the worse the economy generally is for workers and consumers.
  • Less competition means lower wages, higher prices, and conglomerates with enough political influence to stave off regulation that might force them to improve wages, worker safety, or job security.
  • as with virtually all problems, grad school is not the answer to whatever the coronavirus might do to your future.
  • there will be “definitely an increase” in people seeking education post-quarantine, taking advantage of loan availability to acquire expertise that might better position them to build a stable life.
  • those decisions have since worsened their economic strain, while not significantly improving professional outcomes.
  • Private universities may suddenly be too expensive, and frequent plane rides to faraway colleges might seem much riskier. Mass delays will affect things like school budgets and admissions for years, but in ways that are difficult to predict.
  • there is no precedent for a life-interrupting disaster of this scale in America’s current educational and professional structures.
  • What will become of Generation C?
  • Many types of classes don’t work particularly well via videochat, such as chemistry and ecology, which in normal times often ask students to participate in lab work or go out into the natural world.
  • “People with a resource base and finances and so forth, they’re going to get through this a whole lot easier than the families who don’t even have a computer for their children to attend school,”
  • Disasters, he told me, tend to illuminate and magnify existing disadvantages that are more easily ignored by those outside the affected communities during the course of everyday life.
  • Disasters also make clear when disadvantages—polluted neighborhoods, scarce local supplies of fresh fruits and vegetables, risky jobs—have accumulated over a lifetime, leaving some people far more vulnerable to catastrophe than others
  • Children in those communities already have a harder time accessing quality education and getting into college. Their future prospects look dimmer, now that they’re faced with technical and social obstacles and the trauma of watching family members and friends suffer and die during a pandemic.
  • in moments of great despair, people’s understanding of what’s possible shifts.
  • For that to translate to real change, though, it’s crucial that the reactions to the new world we live in be codified into policy. Clues to post-pandemic policy shifts lie in the kinds of political agitation that were already happening before the virus. “Things that already had some support are more likely to take seed,
  • This is where young people might finally be poised to take some control. The 2008 financial crisis appears to have pushed many Millennials leftward
  • When housing prices soared, wages stagnated, and access to basic health care became more scarce, many young people looked around at the richest nation in the world and wondered who was enjoying all the riches. Policies such as Medicare for All, debt cancellation, environmental protections, wealth taxes, criminal-justice reform, jobs programs, and other broad expansions of the social safety net have become rallying cries for young people who experience American life as a rigged game
  • the pandemic’s quick, brutal explication of the ways employment-based health care and loose labor laws have long hurt working people might make for a formative disaster all its own.
  • “There’s a possibility, particularly with who you’re calling Generation C, that their experience of the pandemic against a backdrop of profoundly fragmented politics could lead to some very necessary revolutionary change,”
  • The seeds of that change might have already been planted in the 2018 midterm elections, when young voters turned up in particularly high numbers and helped elect a group of younger, more progressive candidates both locally and nationally.
  • Younger people “aren’t saddled with Cold War imagery and rhetoric. It doesn’t have the same power over our imaginations,”
  • a subset of young voters believes that some American conservatives have cried wolf, deriding everything from public libraries to free doctor visits as creeping socialism until the word lost much of its power to scare.
  • the one-two punch of the Great Recession and the coronavirus pandemic—if handled poorly by those in power—might be enough to create a future America with free health care, a reformed justice system, and better labor protections for working people.
  • But winds of change rarely kick up debris of just one type. The Great Recession opened the minds of wide swaths of young Americans to left-leaning social programs, but its effects are also at least partially responsible for the Tea Party and the Trump presidency. The chaos of a pandemic opens the door for a stronger social safety net, but also for expanded authoritarianism.
  • Beyond politics and policy, the structures that young people have built on their own to endure the pandemic might change life after it, too. Young Americans have responded to the disaster with a wave of volunteerism, including Arora’s internship-information clearinghouse and mutual-aid groups across the country that deliver groceries to those in need.
  • As strong as people’s reactions are in the middle of a crisis, though, people tend to leave behind the traumatic lessons of a disaster as quickly as they can. “Amnesia sets in until the next crisis,” Schoch-Spana said. “Maybe this is different; maybe it’s big enough and disruptive enough that it changes what we imagine it takes to be safe in the world, so I don’t know
Javier E

Medical Mystery: Something Happened to U.S. Health Spending After 1980 - The New York T... - 0 views

  • The United States devotes a lot more of its economic resources to health care than any other nation, and yet its health care outcomes aren’t better for it.
  • That hasn’t always been the case. America was in the realm of other countries in per-capita health spending through about 1980. Then it diverged.
  • It’s the same story with health spending as a fraction of gross domestic product. Likewise, life expectancy. In 1980, the U.S. was right in the middle of the pack of peer nations in life expectancy at birth. But by the mid-2000s, we were at the bottom of the pack.
  • ...30 more annotations...
  • “Medical care is one of the less important determinants of life expectancy,” said Joseph Newhouse, a health economist at Harvard. “Socioeconomic status and other social factors exert larger influences on longevity.”
  • The United States has relied more on market forces, which have been less effective.
  • For spending, many experts point to differences in public policy on health care financing. “Other countries have been able to put limits on health care prices and spending” with government policies
  • One result: Prices for health care goods and services are much higher in the United States.
  • “The differential between what the U.S. and other industrialized countries pay for prescriptions and for hospital and physician services continues to widen over time,”
  • The degree of competition, or lack thereof, in the American health system plays a role
  • periods of rapid growth in U.S. health care spending coincide with rapid growth in markups of health care prices. This is what one would expect in markets with low levels of competition.
  • Although American health care markets are highly consolidated, which contributes to higher prices, there are also enough players to impose administrative drag. Rising administrative costs — like billing and price negotiations across many insurers — may also explain part of the problem.
  • The additional costs associated with many insurers, each requiring different billing documentation, adds inefficiency
  • “We have big pharma vs. big insurance vs. big hospital networks, and the patient and employers and also the government end up paying the bills,”
  • Though we have some large public health care programs, they are not able to keep a lid on prices. Medicare, for example, is forbidden to negotiate as a whole for drug prices,
  • once those spending constraints eased, “suppliers of medical inputs marketed very costly technological innovations with gusto,”
  • , all across the world, one sees constraints on payment, technology, etc., in the 1970s and 1980s,” he said. The United States is not different in kind, only degree; our constraints were weaker.
  • Mr. Starr suggests that the high inflation of the late 1970s contributed to growth in health care spending, which other countries had more systems in place to control
  • These are all highly valuable, but they came at very high prices. This willingness to pay more has in turn made the United States an attractive market for innovation in health care.
  • The last third of the 20th century or so was a fertile time for expensive health care innovation
  • being an engine for innovation doesn’t necessarily translate into better outcomes.
  • international differences in rates of smoking, obesity, traffic accidents and homicides cannot explain why Americans tend to die younger.
  • Some have speculated that slower American life expectancy improvements are a result of a more diverse population
  • But Ms. Glied and Mr. Muennig found that life expectancy growth has been higher in minority groups in the United States
  • even accounting for motor vehicle traffic crashes, firearm-related injuries and drug poisonings, the United States has higher mortality rates than comparably wealthy countries.
  • The lack of universal health coverage and less safety net support for low-income populations could have something to do with it
  • “The most efficient way to improve population health is to focus on those at the bottom,” she said. “But we don’t do as much for them as other countries.”
  • The effectiveness of focusing on low-income populations is evident from large expansions of public health insurance for pregnant women and children in the 1980s. There were large reductions in child mortality associated with these expansions.
  • A report by RAND shows that in 1980 the United States spent 11 percent of its G.D.P. on social programs, excluding health care, while members of the European Union spent an average of about 15 percent. In 2011 the gap had widened to 16 percent versus 22 percent.
  • “Social underfunding probably has more long-term implications than underinvestment in medical care,” he said. For example, “if the underspending is on early childhood education — one of the key socioeconomic determinants of health — then there are long-term implications.”
  • Slow income growth could also play a role because poorer health is associated with lower incomes. “It’s notable that, apart from the richest of Americans, income growth stagnated starting in the late 1970s,”
  • History demonstrates that it is possible for the U.S. health system to perform on par with other wealthy countries
  • That doesn’t mean it’s a simple matter to return to international parity. A lot has changed in 40 years. What began as small gaps in performance are now yawning chasms
  • “For starters, we could have a lot more competition in health care. And government programs should often pay less than they do.” He added that if savings could be reaped from these approaches, and others — and reinvested in improving the welfare of lower-income Americans — we might close both the spending and longevity gaps.
Javier E

How Public Health Took Part in Its Own Downfall - The Atlantic - 0 views

  • when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.
  • By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health
  • By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals
  • ...27 more annotations...
  • In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice. By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.
  • Germ theory offered a seductive new vision for defeating disease: Although the old public health “sought the sources of infectious disease in the surroundings of man; the new finds them in man himself,” wrote Hibbert Hill in The New Public Health in 1913
  • “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.”
  • As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,”
  • Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals,
  • Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research
  • Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education.
  • Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects.
  • That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,”
  • After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism.
  • Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.
  • Even public health began to treat people as if they lived in a social vacuum. Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.
  • This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products.
  • “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”
  • In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.”
  • a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.
  • The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.”
  • Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May
  • the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th century: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons.
  • Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,
  • “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,”
  • In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values.
  • Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”
  • “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”
  • The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department.
  • What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal? In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.
  • “We need to re-create alliances with others and help them to understand that what they are doing is public health,
Javier E

Ozempic or Bust - The Atlantic - 0 views

  • June 2024 Issue
  • it is impossible to know, in the first few years of any novel intervention, whether its success will last.
  • ...77 more annotations...
  • The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started
  • Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.
  • The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic.
  • Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”
  • Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)
  • In the United States, an estimated 189 million adults are classified as having obesity or being overweight
  • The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight
  • For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.
  • The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or cholesterol. By those criteria, more than 140 million American adults already qualify
  • if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”
  • How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk
  • The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years
  • the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not borne out.
  • adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time?
  • “All of us, in the back of our minds, always wonder, Will something show up?  ” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”
  • in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.
  • The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories.
  • But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.
  • Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses.”
  • Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might produce in the very best scenario an average lasting weight loss of just 5 to 10 percent
  • National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.
  • Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm.
  • In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,”
  • In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million.
  • It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life.
  • the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”
  • News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque.
  • Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.”
  • “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”
  • She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”
  • For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.”
  • In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.)
  • In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.
  • The message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water—plain old water—as something “cool” to drink.
  • Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could.
  • Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country.
  • An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.
  • when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.
  • The social engineering worked. Slowly but surely, Americans’ lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out.
  • although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.
  • sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item,
  • That tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.
  • Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.
  • From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.
  • obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”
  • she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions
  • Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.
  • if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it
  • She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit.
  • Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.
  • “Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.
  • Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.
  • As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid
  • By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners
  • But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.
  • Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon who had struggled to overcome an eating disorder had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look
  • The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. In 2015, that rate had risen to three-quarters.
  • In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”
  • that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity.
  • That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism
  • But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”
  • Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events.
  • As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell.
  • Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed.
  • In the meantime, thinness was coming back into fashion
  • In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.
  • Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight.
  • If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat.
  • Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.
  • Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”
  • When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.
  • the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now
  • Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies.
  • “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”
  • The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways
  • When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.”
  • why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.
brookegoodman

Fact check: Democratic presidential debate with Biden vs. Sanders - CNNPolitics - 0 views

  • Washington (CNN)Welcome to CNN's fact check coverage of the eleventh Democratic presidential debate from Washington, DC, ahead of the nation's third super Tuesday, where primaries will be held in Arizona, Florida, Illinois and Ohio on March 17.
  • As Vice President, Biden campaigned with New York Democratic Gov. Andrew Cuomo in 2015 to increase the state minimum wage to $15 an hour.
  • Asked whether he would order a national lockdown to combat the coronavirus pandemic, Biden took a swipe at Sanders' "Medicare for All" proposal. He pointed to Italy, saying that its single-payer health care system hasn't worked to stem the outbreak there.
  • ...20 more annotations...
  • Facts First: This is partly true. As the experience of Italy and other countries shows, having universal coverage and a government-run health system is not enough on its own to stem the spread of coronavirus. But the US is at a disadvantage in fighting the coronavirus because tens of millions of Americans are uninsured or face high out-of-pocket costs before their insurance kicks in -- which may make people hesitant to seek testing or treatment.
  • "Addressing coronavirus with tens of millions of people without health insurance or with inadequate insurance will be a uniquely American challenge among developed countries," tweeted Larry Levitt, executive vice president for health policy at Kaiser. "It will take money to treat people and address uncompensated care absorbed by providers."
  • President Donald Trump has tweeted his support of the package. The Senate is expected to take up the measure when it returns to session this week.
  • Laboratories in Germany developed tests to detect the coronavirus which the WHO adopted and by last week, the WHO sent out tests to 120 countries. Other countries, like the US and China, chose to develop their own tests, according to the Washington Post.
  • On February 12, the Centers for Disease Control reported that some of the coronavirus test kits shipped to labs across the country were not working as they should.
  • Dr. Anthony Fauci, Director of the National Institute of Allergy and Infectious Diseases and one of the experts leading the administration's response to the coronavirus told Congress Thursday that the US was "failing" when it came to getting Americans tested.
  • In an exchange about how the government bailed out banks during the 2008 financial crisis, Biden asserted that Sanders voted against a bailout for the auto industry.
  • Facts First: Sanders is right, but this needs context. Sanders voted for a bill that would have bailed out the auto industry -- but it failed to pass the Senate. He voted against a different bailout measure, the $700 billion Troubled Asset Relief Program, or TARP, which passed. That program released money to banks -- and a portion of that money eventually went to automakers.
  • Sanders on Sunday cited two figures about the number of people he claimed die because of the inadequacy of the US health care system.
  • Facts First: The true number of Americans who die because they are uninsured or lack adequate coverage is not known. Some studies suggest the number is in the tens of thousands per year, but other experts have expressed skepticism that the number is as high as Sanders says.
  • Biden, who was a US senator at the time of his vote, responded, "I learned that I can't take the word of a President when in fact they assured me that they would not use force. Remember the context. The context was the United Nations Security Council was going to vote to insist that we allow inspectors into determining whether or not...they were, in fact, producing nuclear weapons or weapons of mass destruction. They were not."
  • Facts First: Biden's claim is misleading by omission. Biden was an advocate of ending the Saddam Hussein regime for more than a year before the war began in 2003. While Biden did begin calling his 2002 vote a "mistake" in 2005, he was a public supporter of the war in 2003 and 2004 -- and he made clear in 2002 and 2003, both before and after the war started, that he had known he was voting to authorize a possible war, not only to try to get inspectors into Iraq. It's also unclear whether Bush ever made Biden any kind of promise related to the use of force.
  • During an exchange about Sanders' views on authoritarian countries, Biden claimed that China's income gains have been "marginal."
  • One way to measure standard of living is through a country's gross domestic product per capita at purchasing power parity. In other words, looking at a country's GDP per person in international dollars, a hypothetical currency used to measure purchasing power parity between different countries.
  • Facts First: This Sanders claim needs a lot of context. Biden did repeatedly support freezes in Social Security spending and at times called for raising the retirement age. In 2011, he said "changes" would have to be made to entitlements, saying they wouldn't be sustainable -- but he didn't specify what changes. Overall, the claim leaves out that Biden was typically talking about any changes to entitlements in the context of a broader legislative package.
  • And comments Biden made during a 1995 speech on the Senate floor show he was willing to make cuts to Medicare, but only as part of a broader deal that did not advocate cuts as big as Republicans want.
  • "If we are serious about saving Social Security, not raising taxes on the middle class, and not cutting back on benefits desperately needed by many senior citizens, we must adjust this artificial ceiling on Social Security taxes and make the Social Security tax more progressive."
  • Biden said upon his June 2019 reversal that he made "no apologies" for his past support of the amendment. He argued that "times have changed," since, he argued, the right to choose "was not under attack as it is now" from Republicans and since "women's rights and women's health are under assault like we haven't seen in the last 50 years."
  • Facts First: While it's unclear which ad Sanders was referring to, at least one super PAC connected to Biden, Unite the Country, ran a large television ad campaign that implicitly criticized Sanders without mentioning him by name.
  • For instance, it includes a clip from a Biden speech, in which Biden says "Democrats want a nominee who's a Democrat" -- an apparent challenge to the party bonafides of Sanders, who serves as an independent in the US Senate and describes himself as a democratic socialist.