
Javier E

These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept.
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies.
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion.
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change.
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country.
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: “They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.”60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • Even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • The war ended in 1848 with the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • After the Mexican cession, the territory comprising the United States had grown, by one contemporary reckoning, to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • The Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • Chief Justice Roger Taney ruled, in Dred Scott v. Sandford, “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • Portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for Lincoln and Douglas’s first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, eleven states had joined the Confederacy; the slave states as a whole stretched over 900,000 square miles and contained 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. The Confederacy rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description: “Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____,” as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 (Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.)
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” Mary Lease said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and, as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • This instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • Henry George devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • States deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention to that end.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • Like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson.
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • Its leaders argued that fighting the inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Woodrow Wilson would likely have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions themselves to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department, the Committee on Public Information, headed by the journalist George Creel.
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

How Public Health Took Part in Its Own Downfall - The Atlantic - 0 views

  • when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.
  • By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health
  • By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals
  • ...27 more annotations...
  • In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice. By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.
  • Germ theory offered a seductive new vision for defeating disease: Although the old public health “sought the sources of infectious disease in the surroundings of man; the new finds them in man himself,” wrote Hibbert Hill in The New Public Health in 1913
  • “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.”
  • As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,”
  • Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals,
  • Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research
  • Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education.
  • Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects.
  • That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,”
  • After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism.
  • Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.
  • Even public health began to treat people as if they lived in a social vacuum. Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.
  • This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products.
  • “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”
  • In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.”
  • a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.
  • The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.”
  • Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May
  • the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th century: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons.
  • Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,
  • “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,”
  • In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values.
  • Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”
  • “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”
  • The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department.
  • What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal? In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.
  • “We need to re-create alliances with others and help them to understand that what they are doing is public health,
Javier E

The nation's public health agencies are ailing when they're needed most - The Washingto... - 0 views

  • At the very moment the United States needed its public health infrastructure the most, many local health departments had all but crumbled, proving ill-equipped to carry out basic functions let alone serve as the last line of defense against the most acute threat to the nation’s health in generations.
  • Epidemiologists, academics and local health officials across the country say the nation’s public health system is one of many weaknesses that continue to leave the United States poorly prepared to handle the coronavirus pandemic
  • That system lacks financial resources. It is losing staff by the day.
  • ...31 more annotations...
  • Even before the pandemic struck, local public health agencies had lost almost a quarter of their overall workforce since 2008 — a reduction of almost 60,000 workers
  • The agencies’ main source of federal funding — the Centers for Disease Control and Prevention’s emergency preparedness budget — had been cut 30 percent since 2003. The Trump administration had proposed slicing even deeper.
  • According to David Himmelstein of the CUNY School of Public Health, global consensus is that, at minimum, 6 percent of a nation’s health spending should be devoted to public health efforts. The United States, he said, has never spent more than half that much.
  • the problems have been left to fester.
  • Delaware County, Pa., a heavily populated Philadelphia suburb, did not even have a public health department when the pandemic struck and had to rely on a neighbor to mount a response.
  • With plunging tax receipts straining local government budgets, public health agencies confront the possibility of further cuts in an economy gutted by the coronavirus. It is happening at a time when health departments are being asked to do more than ever.
  • While the country spends roughly $3.6 trillion every year on health, less than 3 percent of that spending goes to public health and prevention [a rough dollar-figure sketch of that gap follows at the end of this list]
  • “Why an ongoing government function should depend on episodic grants rather than consistent funding, I don’t know,” he added. “That would be like seeing that the military is going to apply for a grant for its regular ongoing activities.”
  • Compared with Canada, the United Kingdom and northern European countries, the United States — with a less generous social safety net and no universal health care — is investing less in a system that its people rely on more.
  • Himmelstein said that the United States has never placed much emphasis on public health spending but that the investment began to decline even further in the early 2000s. The Great Recession fueled further cuts.
  • Plus, the U.S. public health system relies heavily on federal grants.
  • “That’s the way we run much of our public health activity for local health departments. You apply to the CDC, which is the major conduit for federal funding to state and local health departments,” Himmelstein said. “You apply to them for funding for particular functions, and if you don’t get the grant, you don’t have the funding for that.”
  • Many public health officials say a lack of a national message and approach to the pandemic has undermined their credibility and opened them up to criticism.
  • Few places were less prepared for covid-19’s arrival than Delaware County, Pa., where Republican leaders had decided they did not need a public health department at all
  • At the same time, many countries that invest more in public health infrastructure also provide universal medical coverage that enables them to provide many common public health services as part of their main health-care-delivery system.
  • Taylor and other elected officials worked out a deal with neighboring Chester County in which Delaware County paid affluent Chester County’s health department to handle coronavirus operations for both counties for now.
  • One reason health departments are so often neglected is their work focuses on prevention — of outbreaks, sexually transmitted diseases, smoking-related illnesses. Local health departments describe a frustrating cycle: The more successful they are, the less visible problems are and the less funding they receive. Often, that sets the stage for problems to explode again — as infectious diseases often do.
  • It has taken years for many agencies to rebuild budgets and staffing from deep cuts made during the last recession.
  • During the past decade, many local health departments have seen annual rounds of cuts, punctuated with one-time infusions of money following crises such as outbreaks of Zika, Ebola, measles and hepatitis. The problem with that cycle of feast or famine funding is that the short-term money quickly dries up and does nothing to address long-term preparedness.
  • “It’s a silly strategic approach when you think about what’s needed to protect us long term,”
  • She compared the country’s public health system to a house with deep cracks in the foundation. The emergency surges of funding are superficial repairs that leave those cracks unaddressed.
  • “We came into this pandemic at a severe deficit and are still without a strategic goal to build back that infrastructure. We need to learn from our mistakes,”
  • With the economy tanking, the tax bases for cities and counties have shrunken dramatically — payroll taxes, sales taxes, city taxes. Many departments have started cutting staff. Federal grants are no sure thing.
  • 80 percent of counties have reported their budget was affected in the current fiscal year because of the crisis. Prospects are even more dire for future budget periods, when the full impact of reduced tax revenue will become evident.
  • Christine Hahn, medical director for Idaho’s division of public health and a 25-year public health veteran, has seen the state make progress in coronavirus testing and awareness. But like so many public health officials across the country taking local steps to deal with what has become a national problem, she is limited by how much government leaders say she can do and by what citizens are willing to do.
  • “I’ve been through SARS, the 2009 pandemic, the anthrax attacks, and of course I’m in rural Idaho, not New York City and California,” Hahn said. “But I will say this is way beyond anything I’ve ever experienced as far as stress, workload, complexity, frustration, media and public interest, individual citizens really feeling very strongly about what we’re doing and not doing.”
  • “I think the general population didn’t really realize we didn’t have a health department. They just kind of assumed that was one of those government agencies we had,” Taylor said. “Then the pandemic hit, and everyone was like, ‘Wait, hold on — we don’t have a health department? Why don’t we have a health department?’ ”
  • “People locally are looking to see what’s happening in other states, and we’re constantly having to talk about that and address that,”
  • “I’m mindful of the credibility of our messaging as people say, ‘What about what they’re doing in this place? Why are we not doing what they’re doing?’ ”
  • Many health experts worry the challenges will multiply in the fall with the arrival of flu season.
  • “The unfolding tragedy here is we need people to see local public health officials as heroes in the same way that we laud heart surgeons and emergency room doctors,” Westergaard, the Wisconsin epidemiologist, said. “The work keeps getting higher, and they’re falling behind — and not feeling appreciated by their communities.”
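To put the spending shares quoted in this list into rough dollar terms, here is a back-of-the-envelope sketch. It assumes only the figures cited above — roughly $3.6 trillion in total annual U.S. health spending, under 3 percent of it going to public health and prevention, against the roughly 6 percent minimum benchmark Himmelstein describes — and is an illustration, not an official estimate.

```python
# Back-of-the-envelope sketch of the public-health funding gap described above.
# Assumptions (all taken from figures quoted in this list): ~$3.6 trillion in
# total annual U.S. health spending, <3% of it devoted to public health and
# prevention, versus the ~6% minimum share cited as a global consensus.
total_health_spending = 3.6e12      # dollars per year
current_share = 0.03                # upper bound on the U.S. public-health share
benchmark_share = 0.06              # minimum benchmark share

current_dollars = total_health_spending * current_share
benchmark_dollars = total_health_spending * benchmark_share

print(f"Current public-health spending (upper bound): ~${current_dollars / 1e9:.0f} billion/year")
print(f"6% benchmark: ~${benchmark_dollars / 1e9:.0f} billion/year")
print(f"Implied shortfall: at least ~${(benchmark_dollars - current_dollars) / 1e9:.0f} billion/year")
```

On those figures, meeting the benchmark would mean devoting at least roughly $100 billion a year more to public health than the country does now.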
Javier E

How Will the Coronavirus End? - The Atlantic - 0 views

  • A global pandemic of this scale was inevitable. In recent years, hundreds of health experts have written books, white papers, and op-eds warning of the possibility. Bill Gates has been telling anyone who would listen, including the 18 million viewers of his TED Talk.
  • We realized that her child might be one of the first of a new cohort who are born into a society profoundly altered by COVID-19. We decided to call them Generation C.
  • “No matter what, a virus [like SARS-CoV-2] was going to test the resilience of even the most well-equipped health systems,”
  • ...56 more annotations...
  • To contain such a pathogen, nations must develop a test and use it to identify infected people, isolate them, and trace those they’ve had contact with. That is what South Korea, Singapore, and Hong Kong did to tremendous effect. It is what the United States did not.
  • That a biomedical powerhouse like the U.S. should so thoroughly fail to create a very simple diagnostic test was, quite literally, unimaginable. “I’m not aware of any simulations that I or others have run where we [considered] a failure of testing,”
  • The testing fiasco was the original sin of America’s pandemic failure, the single flaw that undermined every other countermeasure. If the country could have accurately tracked the spread of the virus, hospitals could have executed their pandemic plans, girding themselves by allocating treatment rooms, ordering extra supplies, tagging in personnel, or assigning specific facilities to deal with COVID-19 cases.
  • None of that happened. Instead, a health-care system that already runs close to full capacity, and that was already challenged by a severe flu season, was suddenly faced with a virus that had been left to spread, untracked, through communities around the country.
  • With little room to surge during a crisis, America’s health-care system operates on the assumption that unaffected states can help beleaguered ones in an emergency.
  • That ethic works for localized disasters such as hurricanes or wildfires, but not for a pandemic that is now in all 50 states. Cooperation has given way to competition
  • Partly, that’s because the White House is a ghost town of scientific expertise. A pandemic-preparedness office that was part of the National Security Council was dissolved in 2018. On January 28, Luciana Borio, who was part of that team, urged the government to “act now to prevent an American epidemic,” and specifically to work with the private sector to develop fast, easy diagnostic tests. But with the office shuttered, those warnings were published in The Wall Street Journal, rather than spoken into the president’s ear.
  • Rudderless, blindsided, lethargic, and uncoordinated, America has mishandled the COVID-19 crisis to a substantially worse degree than what every health expert I’ve spoken with had feared. “Much worse,”
  • “Beyond any expectations we had,” said Lauren Sauer, who works on disaster preparedness at Johns Hopkins Medicine. “As an American, I’m horrified,” said Seth Berkley, who heads Gavi, the Vaccine Alliance. “The U.S. may end up with the worst outbreak in the industrialized world.”
  • it will be difficult—but not impossible—for the United States to catch up. To an extent, the near-term future is set because COVID-19 is a slow and long illness. People who were infected several days ago will only start showing symptoms now, even if they isolated themselves in the meantime. Some of those people will enter intensive-care units in early April
  • A “massive logistics and supply-chain operation [is] now needed across the country,” says Thomas Inglesby of Johns Hopkins Bloomberg School of Public Health. That can’t be managed by small and inexperienced teams scattered throughout the White House. The solution, he says, is to tag in the Defense Logistics Agency—a 26,000-person group that prepares the U.S. military for overseas operations and that has assisted in past public-health crises, including the 2014 Ebola outbreak.
  • The first and most important is to rapidly produce masks, gloves, and other personal protective equipment
  • it would also come at a terrible cost: SARS-CoV-2 is more transmissible and fatal than the flu, and it would likely leave behind many millions of corpses and a trail of devastated health systems.
  • This agency can also coordinate the second pressing need: a massive rollout of COVID-19 tests.
  • These measures will take time, during which the pandemic will either accelerate beyond the capacity of the health system or slow to containable levels. Its course—and the nation’s fate—now depends on the third need, which is social distancing.
  • There are now only two groups of Americans. Group A includes everyone involved in the medical response, whether that’s treating patients, running tests, or manufacturing supplies. Group B includes everyone else, and their job is to buy Group A more time. Group B must now “flatten the curve” by physically isolating themselves from other people to cut off chains of transmission.
  • Given the slow fuse of COVID-19, to forestall the future collapse of the health-care system, these seemingly drastic steps must be taken immediately, before they feel proportionate, and they must continue for several weeks.
  • Persuading a country to voluntarily stay at home is not easy, and without clear guidelines from the White House, mayors, governors, and business owners have been forced to take their own steps.
  • when the good of all hinges on the sacrifices of many, clear coordination matters—the fourth urgent need
  • Pundits and business leaders have used similar rhetoric, arguing that high-risk people, such as the elderly, could be protected while lower-risk people are allowed to go back to work. Such thinking is seductive, but flawed. It overestimates our ability to assess a person’s risk, and to somehow wall off the ‘high-risk’ people from the rest of society. It underestimates how badly the virus can hit ‘low-risk’ groups, and how thoroughly hospitals will be overwhelmed if even just younger demographics are falling sick.
  • A recent analysis from the University of Pennsylvania estimated that even if social-distancing measures can reduce infection rates by 95 percent, 960,000 Americans will still need intensive care.
  • There are only about 180,000 ventilators in the U.S. and, more pertinently, only enough respiratory therapists and critical-care staff to safely look after 100,000 ventilated patients. Abandoning social distancing would be foolish. Abandoning it now, when tests and protective equipment are still scarce, would be catastrophic.
  • If Trump stays the course, if Americans adhere to social distancing, if testing can be rolled out, and if enough masks can be produced, there is a chance that the country can still avert the worst predictions about COVID-19, and at least temporarily bring the pandemic under control. No one knows how long that will take, but it won’t be quick. “It could be anywhere from four to six weeks to up to three months,” Fauci said, “but I don’t have great confidence in that range.”
  • there are three possible endgames: one that’s very unlikely, one that’s very dangerous, and one that’s very long.
  • The first is that every nation manages to simultaneously bring the virus to heel, as with the original SARS in 2003. Given how widespread the coronavirus pandemic is, and how badly many countries are faring, the odds of worldwide synchronous control seem vanishingly small.
  • The second is that the virus does what past flu pandemics have done: It burns through the world and leaves behind enough immune survivors that it eventually struggles to find viable hosts. This “herd immunity” scenario would be quick, and thus tempting
  • The U.S. has fewer hospital beds per capita than Italy. A study released by a team at Imperial College London concluded that if the pandemic is left unchecked, those beds will all be full by late April. By the end of June, for every available critical-care bed, there will be roughly 15 COVID-19 patients in need of one.  By the end of the summer, the pandemic will have directly killed 2.2 million Americans,
  • The third scenario is that the world plays a protracted game of whack-a-mole with the virus, stamping out outbreaks here and there until a vaccine can be produced. This is the best option, but also the longest and most complicated.
  • there are no existing vaccines for coronaviruses—until now, these viruses seemed to cause diseases that were mild or rare—so researchers must start from scratch.
  • The first steps have been impressively quick. Last Monday, a possible vaccine created by Moderna and the National Institutes of Health went into early clinical testing. That marks a 63-day gap between scientists sequencing the virus’s genes for the first time and doctors injecting a vaccine candidate into a person’s arm. “It’s overwhelmingly the world record,” Fauci said.
  • The initial trial will simply tell researchers if the vaccine seems safe, and if it can actually mobilize the immune system. Researchers will then need to check that it actually prevents infection from SARS-CoV-2. They’ll need to do animal tests and large-scale trials to ensure that the vaccine doesn’t cause severe side effects. They’ll need to work out what dose is required, how many shots people need, if the vaccine works in elderly people, and if it requires other chemicals to boost its effectiveness.
  • No matter which strategy is faster, Berkley and others estimate that it will take 12 to 18 months to develop a proven vaccine, and then longer still to make it, ship it, and inject it into people’s arms.
  • as the status quo returns, so too will the virus. This doesn’t mean that society must be on continuous lockdown until 2022. But “we need to be prepared to do multiple periods of social distancing,” says Stephen Kissler of Harvard.
  • First: seasonality. Coronaviruses tend to be winter infections that wane or disappear in the summer. That may also be true for SARS-CoV-2, but seasonal variations might not sufficiently slow the virus when it has so many immunologically naive hosts to infect.
  • Second: duration of immunity. When people are infected by the milder human coronaviruses that cause cold-like symptoms, they remain immune for less than a year. By contrast, the few who were infected by the original SARS virus, which was far more severe, stayed immune for much longer.
  • scientists will need to develop accurate serological tests, which look for the antibodies that confer immunity. They’ll also need to confirm that such antibodies actually stop people from catching or spreading the virus. If so, immune citizens can return to work, care for the vulnerable, and anchor the economy during bouts of social distancing.
  • Aspects of America’s identity may need rethinking after COVID-19. Many of the country’s values have seemed to work against it during the pandemic. Its individualism, exceptionalism, and tendency to equate doing whatever you want with an act of resistance meant that when it came time to save lives and stay indoors, some people flocked to bars and clubs.
  • “We can keep schools and businesses open as much as possible, closing them quickly when suppression fails, then opening them back up again once the infected are identified and isolated. Instead of playing defense, we could play more offense.”
  • The vaccine may need to be updated as the virus changes, and people may need to get revaccinated on a regular basis, as they currently do for the flu. Models suggest that the virus might simmer around the world, triggering epidemics every few years or so. “But my hope and expectation is that the severity would decline, and there would be less societal upheaval,”
  • After infections begin ebbing, a secondary pandemic of mental-health problems will follow.
  • But “there is also the potential for a much better world after we get through this trauma,”
  • Testing kits can be widely distributed to catch the virus’s return as quickly as possible. There’s no reason that the U.S. should let SARS-CoV-2 catch it unawares again, and thus no reason that social-distancing measures need to be deployed as broadly and heavy-handedly as they now must be.
  • Pandemics can also catalyze social change. People, businesses, and institutions have been remarkably quick to adopt or call for practices that they might once have dragged their heels on, including working from home, conference-calling to accommodate people with disabilities, proper sick leave, and flexible child-care arrangements.
  • Perhaps the nation will learn that preparedness isn’t just about masks, vaccines, and tests, but also about fair labor policies and a stable and equal health-care system. Perhaps it will appreciate that health-care workers and public-health specialists compose America’s social immune system, and that this system has been suppressed.
  • Attitudes to health may also change for the better. The rise of HIV and AIDS “completely changed sexual behavior among young people who were coming into sexual maturity at the height of the epidemic,”
  • Years of isolationist rhetoric had consequences too.
  • “People believed the rhetoric that containment would work,” says Wendy Parmet, who studies law and public health at Northeastern University. “We keep them out, and we’ll be okay. When you have a body politic that buys into these ideas of isolationism and ethnonationalism, you’re especially vulnerable when a pandemic hits.”
  • Pandemics are democratizing experiences. People whose privilege and power would normally shield them from a crisis are facing quarantines, testing positive, and losing loved ones. Senators are falling sick. The consequences of defunding public-health agencies, losing expertise, and stretching hospitals are no longer manifesting as angry opinion pieces, but as faltering lungs.
  • After COVID-19, attention may shift to public health. Expect to see a spike in funding for virology and vaccinology, a surge in students applying to public-health programs, and more domestic production of medical supplies.
  • The lessons that America draws from this experience are hard to predict, especially at a time when online algorithms and partisan broadcasters only serve news that aligns with their audience’s preconceptions.
  • “The transitions after World War II or 9/11 were not about a bunch of new ideas,” he says. “The ideas are out there, but the debates will be more acute over the next few months because of the fluidity of the moment and willingness of the American public to accept big, massive changes.”
  • One could easily conceive of a world in which most of the nation believes that America defeated COVID-19. Despite his many lapses, Trump’s approval rating has surged. Imagine that he succeeds in diverting blame for the crisis to China, casting it as the villain and America as the resilient hero.
  • One could also envisage a future in which America learns a different lesson. A communal spirit, ironically born through social distancing, causes people to turn outward, to neighbors both foreign and domestic. The election of November 2020 becomes a repudiation of “America first” politics. The nation pivots, as it did after World War II, from isolationism to international cooperation
  • The U.S. leads a new global partnership focused on solving challenges like pandemics and climate change.
  • In 2030, SARS-CoV-3 emerges from nowhere, and is brought to heel within a month.
  • On the Global Health Security Index, a report card that grades every country on its pandemic preparedness, the United States has a score of 83.5—the world’s highest. Rich, strong, developed, America is supposed to be the readiest of nations. That illusion has been shattered. Despite months of advance warning as the virus spread in other countries, when America was finally tested by COVID-19, it failed.
Javier E

Medical Mystery: Something Happened to U.S. Health Spending After 1980 - The New York T... - 0 views

  • The United States devotes a lot more of its economic resources to health care than any other nation, and yet its health care outcomes aren’t better for it.
  • That hasn’t always been the case. America was in the realm of other countries in per-capita health spending through about 1980. Then it diverged.
  • It’s the same story with health spending as a fraction of gross domestic product. Likewise, life expectancy. In 1980, the U.S. was right in the middle of the pack of peer nations in life expectancy at birth. But by the mid-2000s, we were at the bottom of the pack.
  • ...30 more annotations...
  • “Medical care is one of the less important determinants of life expectancy,” said Joseph Newhouse, a health economist at Harvard. “Socioeconomic status and other social factors exert larger influences on longevity.”
  • The United States has relied more on market forces, which have been less effective.
  • For spending, many experts point to differences in public policy on health care financing. “Other countries have been able to put limits on health care prices and spending” with government policies
  • One result: Prices for health care goods and services are much higher in the United States.
  • “The differential between what the U.S. and other industrialized countries pay for prescriptions and for hospital and physician services continues to widen over time,”
  • The degree of competition, or lack thereof, in the American health system plays a role
  • periods of rapid growth in U.S. health care spending coincide with rapid growth in markups of health care prices. This is what one would expect in markets with low levels of competition.
  • Although American health care markets are highly consolidated, which contributes to higher prices, there are also enough players to impose administrative drag. Rising administrative costs — like billing and price negotiations across many insurers — may also explain part of the problem.
  • The additional costs associated with many insurers, each requiring different billing documentation, adds inefficiency
  • “We have big pharma vs. big insurance vs. big hospital networks, and the patient and employers and also the government end up paying the bills,”
  • Though we have some large public health care programs, they are not able to keep a lid on prices. Medicare, for example, is forbidden to negotiate as a whole for drug prices,
  • once those spending constraints eased, “suppliers of medical inputs marketed very costly technological innovations with gusto,”
  • “All across the world, one sees constraints on payment, technology, etc., in the 1970s and 1980s,” he said. The United States is not different in kind, only degree; our constraints were weaker.
  • Mr. Starr suggests that the high inflation of the late 1970s contributed to growth in health care spending, which other countries had more systems in place to control
  • These are all highly valuable, but they came at very high prices. This willingness to pay more has in turn made the United States an attractive market for innovation in health care.
  • The last third of the 20th century or so was a fertile time for expensive health care innovation
  • being an engine for innovation doesn’t necessarily translate into better outcomes.
  • international differences in rates of smoking, obesity, traffic accidents and homicides cannot explain why Americans tend to die younger.
  • Some have speculated that slower American life expectancy improvements are a result of a more diverse population
  • But Ms. Glied and Mr. Muennig found that life expectancy growth has been higher in minority groups in the United States
  • even accounting for motor vehicle traffic crashes, firearm-related injuries and drug poisonings, the United States has higher mortality rates than comparably wealthy countries.
  • The lack of universal health coverage and less safety net support for low-income populations could have something to do with it
  • “The most efficient way to improve population health is to focus on those at the bottom,” she said. “But we don’t do as much for them as other countries.”
  • The effectiveness of focusing on low-income populations is evident from large expansions of public health insurance for pregnant women and children in the 1980s. There were large reductions in child mortality associated with these expansions.
  • A report by RAND shows that in 1980 the United States spent 11 percent of its G.D.P. on social programs, excluding health care, while members of the European Union spent an average of about 15 percent. In 2011 the gap had widened to 16 percent versus 22 percent.
  • “Social underfunding probably has more long-term implications than underinvestment in medical care,” he said. For example, “if the underspending is on early childhood education — one of the key socioeconomic determinants of health — then there are long-term implications.”
  • Slow income growth could also play a role because poorer health is associated with lower incomes. “It’s notable that, apart from the richest of Americans, income growth stagnated starting in the late 1970s,”
  • History demonstrates that it is possible for the U.S. health system to perform on par with other wealthy countries
  • That doesn’t mean it’s a simple matter to return to international parity. A lot has changed in 40 years. What began as small gaps in performance are now yawning chasms
  • “For starters, we could have a lot more competition in health care. And government programs should often pay less than they do.” He added that if savings could be reaped from these approaches, and others — and reinvested in improving the welfare of lower-income Americans — we might close both the spending and longevity gaps.
anonymous

Covid-19 Relief Bill Fulfills Biden's Promise to Expand Obamacare, for Two Years - The ... - 0 views

  • President Biden’s $1.9 trillion coronavirus relief bill will fulfill one of his central campaign promises, to fill the holes in the Affordable Care Act and make health insurance affordable for more than a million middle-class Americans who could not afford insurance under the original law.
  • The changes will last only for two years. But for some, they will be considerable: The Congressional Budget Office estimated that a 64-year-old earning $58,000 would see monthly payments decline from $1,075 under current law to $412 because the federal government would take up much of the cost.
  • “For people that are eligible but not buying insurance it’s a financial issue, and so upping the subsidies is going to make the price point come down,” said Ezekiel Emanuel, a health policy expert and professor at the University of Pennsylvania who advised Mr. Biden during his transition. The bill, he said, would “make a big dent in the number of the uninsured.”
  • ...24 more annotations...
  • “Obviously it’s an improvement, but I think that it is inadequate given the health care crisis that we’re in,” said Representative Ro Khanna, a progressive Democrat from California who favors the single-payer, government-run system called Medicare for All that has been embraced by Senator Bernie Sanders, independent of Vermont, and the Democratic left.
  • “We’re in a national health care crisis,” Mr. Khanna said. “Fifteen million people just lost private health insurance. This would be the time for the government to say, at the very least, for those 15 million that we ought to put them on Medicare.”
  • The stimulus bill would make upper-middle-income Americans newly eligible for financial help to buy plans on the federal marketplaces, and the premiums for those plans would cost no more than 8.5 percent of an individual’s modified adjusted gross income. It would also increase subsidies for lower-income enrollees. [A rough arithmetic check of that cap follows at the end of this list.]
  • Just when Mr. Biden or Democrats would put forth such a plan remains unclear, and passage in an evenly divided Senate would be an uphill struggle. White House officials have said Mr. Biden wants to get past the coronavirus relief bill before laying out a more comprehensive domestic policy agenda.
  • The Affordable Care Act is near and dear to Mr. Biden, who memorably used an expletive to describe it as a big deal when he was vice president and President Barack Obama signed it into law in 2010. It has expanded coverage to more than 20 million Americans, cutting the uninsured rate to 10.9 percent in 2019 from 17.8 percent in 2010.
  • Even so, some 30 million Americans were uninsured between January and June 2020, according to the latest figures available from the National Health Interview Survey. The problem has only grown worse during the coronavirus pandemic, when thousands if not millions of Americans lost insurance because they lost their jobs.
  • Mr. Biden made clear when he was running for the White House that he did not favor Medicare for All, but instead wanted to strengthen and expand the Affordable Care Act. The bill that is expected to reach his desk in time for a prime-time Oval Office address on Thursday night would do that. The changes to the health law would cover 1.3 million more Americans and cost about $34 billion, according to the Congressional Budget Office.
  • Republicans have always said that their plan was to repeal and replace the health law, but after 10 years they have yet to come up with a replacement. Mr. Ayres said his firm is working on “coming up with some alternative health care message” that does not involve “simply throwing everybody into a government-run health care problem.”
  • Yet polls show that the idea of a government-run program is gaining traction with voters. In September, the Pew Research Center reported that over the previous year, there had been an increase, especially among Democrats, in the share of Americans who say health insurance should be provided by a single national program run by the government.
  • “I would argue there is more momentum for Medicare expansion given the pandemic and the experience people are having,” said Mr. Khanna, the California congressman. “They bought time, but I think at some point there will be a debate on a permanent fix.”
  • Under the changes, the signature domestic achievement of the Obama administration will reach middle-income families who have been discouraged from buying health plans on the federal marketplace because they come with high premiums and little or no help from the government.
  • But because those provisions last only two years, the relief bill almost guarantees that health care will be front and center in the 2022 midterm elections, when Republicans will attack the measure as a wasteful expansion of a health law they have long hated. Meantime, some liberal Democrats may complain that the changes only prove that a patchwork approach to health care coverage will never work.
  • The poll found that 36 percent of Americans, and 54 percent of Democrats, favored a single national program. When asked if the government had a responsibility to provide health insurance, either through a single national program or a mix of public and private programs, 63 percent of Americans and 88 percent of Democrats said yes.
  • In January, he ordered the Affordable Care Act’s health insurance marketplaces reopened to give people throttled by the pandemic economy a new chance to obtain coverage.
  • With its expanded subsidies for health plans under the Affordable Care Act, the coronavirus relief bill makes insurance more affordable, and puts health care on the ballot in 2022.
  • The bill includes rich new incentives to entice the few holdout states — including Texas, Georgia and Florida — to finally expand Medicaid to those with too much money to qualify for the federal health program for the poor, but too little to afford private coverage.
  • “Biden promised voters a public option, and it is a promise he has to keep,” said Waleed Shahid, a spokesman for Justice Democrats, the liberal group that helped elect Representative Alexandria Ocasio-Cortez and other progressive Democrats. Of the stimulus bill, he said, “I don’t think anyone thinks this is Biden’s health care plan.”
  • “I think that argument has been fought and lost,” said Whit Ayres, a Republican pollster, conceding that the repeal efforts are over, at least for now, with Democrats in charge of the White House and both houses of Congress.
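A rough arithmetic check of the 8.5 percent premium cap mentioned above, applied to the CBO’s example of a 64-year-old earning $58,000. This is a sketch under the assumption that the cap applies to the full income figure; the CBO’s own estimate also depends on the local benchmark plan price, so this is only a plausibility check, not the actual subsidy formula.

```python
# Sketch of the 8.5%-of-income premium cap in the relief bill, applied to the
# CBO's example of a 64-year-old earning $58,000. Assumes the cap applies to
# the full income figure; the CBO estimate also reflects benchmark plan prices.
income = 58_000        # modified adjusted gross income in the CBO example
cap_share = 0.085      # maximum share of income payable in premiums

annual_cap = income * cap_share
monthly_cap = annual_cap / 12
print(f"Annual premium cap: ${annual_cap:,.0f}")    # $4,930
print(f"Monthly premium cap: ${monthly_cap:,.0f}")  # ~$411
```

That lines up with the roughly $412 monthly payment the CBO projects under the bill, versus $1,075 under prior law.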
Javier E

How Coronavirus Overpowered the World Health Organization - WSJ - 1 views

  • The WHO spent years and hundreds of millions of dollars honing a globe-spanning system of defenses against a pandemic it knew would come. But the virus moved faster than the United Nations agency, exposing flaws in its design and operation that bogged down its response when the world needed to take action.
  • The WHO relied on an honor system to stop a viral cataclysm. Its member states had agreed to improve their ability to contain infectious disease epidemics and to report any outbreaks that might spread beyond their borders. International law requires them to do both.
  • Time and again, countries big and small have failed to do so. The WHO, which isn’t a regulatory agency, lacks the authority to force information from the very governments that finance its programs and elect its leaders
  • ...49 more annotations...
  • years of painstakingly worded treaties, high-level visits and cutting-edge disease surveillance—all meant to encourage good-faith cooperation—have only bitten around the edges of the problem.
  • “It can’t demand entry into a country because they think something bad is happening.”
  • Nearly 200 countries were counting on an agency whose budget—roughly $2.4 billion in 2020—is less than a sixth of the Maryland Department of Health’s. Its donors, largely Western governments, earmark most of that money for causes other than pandemic preparedness.
  • In 2018 and 2019, about 8% of the WHO’s budget went to activities related to pandemic preparedness
  • the agency’s bureaucratic structure, diplomatic protocol and funding were no match for a pandemic as widespread and fast-moving as Covid-19.
  • To write its recommendations, the WHO solicits outside experts, which can be a slow process.
  • It took those experts more than four months to agree that widespread mask-wearing helps, and that people who are talking, shouting or singing can expel the virus through tiny particles that linger in the air. In that time, about half a million people died.
  • As months rolled on, it became clear that governments were reluctant to allow the U.N. to scold, shame or investigate them.
  • In particular, The Wall Street Journal found:
  • * China appears to have violated international law requiring governments to swiftly inform the WHO and keep it in the loop about an alarming infectious-disease cluster—there are no clear consequences for violations
  • * The WHO lost a critical week waiting for an advisory panel to recommend a global public-health emergency, because some of its members were overly hopeful that the new disease wasn’t easily transmissible from one person to another.
  • * The institution overestimated how prepared some wealthy countries were, while focusing on developing countries, where much of its ordinary assistance is directed
  • Public-health leaders say the WHO plays a critical role in global health, leading responses to epidemics and setting health policies and standards for the world. It coordinates a multinational effort every year to pick the exact strains that go into the seasonal flu vaccine, and has provided public guidance and advice on Covid-19 when many governments were silent.
  • The world’s public-health agency was born weak, created in 1948 over U.S. and U.K. reluctance. For decades, it was legally barred from responding to diseases that it learned about from the news. Countries were required to report outbreaks of only four diseases to the WHO: yellow fever, plague, cholera and smallpox, which was eradicated in 1980.
  • Nearly three times that amount was budgeted for eradicating polio, a top priority for the WHO’s two largest contributors: the U.S. and the Bill & Melinda Gates Foundation.
  • SARS convinced governments to retool the WHO. The next year, delegates arrived in the Geneva palace where the League of Nations once met to resolve a centuries-old paradox: Countries don’t report outbreaks, because they fear—correctly—their neighbors will respond by blocking travel and trade.
  • “Everybody pushed back. No sovereign country wants to have this.”
  • China wanted an exemption from immediately reporting SARS outbreaks. The U.S. argued it couldn’t compel its 50 states to cooperate with the treaty. Iran blocked American proposals to make the WHO focus on bioterrorism. Cuba had an hourslong list of objections.
  • Around 3:15 a.m. on the last day, exhausted delegates ran out of time. The treaty they approved, called the International Health Regulations, imagined that each country would quickly and honestly report, then contain, any alarming outbreaks
  • In return, the treaty discouraged restrictions on travel and trade. There would be no consequences for reporting an outbreak—yet no way to punish a country for hiding one.
  • The treaty’s key chokepoint: Before declaring a “public health emergency of international concern,” or PHEIC, the WHO’s director-general would consult a multinational emergency committee and give the country in question a chance to argue against such a declaration.
  • Delegates agreed this could give some future virus a head start but decided it was more important to discourage the WHO from making any unilateral announcements that could hurt their economies.
  • On Jan. 22, a committee of 15 scientists haggled for hours over Chinese data and a handful of cases in other countries. Clearly, the virus was spreading between people in China, though there was no evidence of that in other countries. The question now: Was it mainly spreading from very sick people in hospitals and homes—or more widely?
  • On Jan. 3, representatives of China’s National Health Commission arrived at the WHO office in Beijing. The NHC acknowledged a cluster of pneumonia cases, but didn’t confirm that the new pathogen was a coronavirus, a fact Chinese officials already knew.
  • That same day, the NHC issued an internal notice ordering laboratories to hand over or destroy testing samples and forbade anyone from publishing unauthorized research on the virus.
  • China’s failure to notify the WHO of the cluster of illnesses is a violation of the International Health Regulations
  • China also flouted the IHR by not disclosing all key information it had to the WHO
  • The WHO said it’s up to member states to decide whether a country has complied with international health law, and that the coming review will address those issues.
  • While Chinese scientists had sequenced the genome and posted it publicly, the government was less forthcoming about how patients might be catching the virus.
  • WHO scientists pored over data they did get, and consulted with experts from national health agencies, including the CDC, which has 33 staff detailed to the WHO.
  • Then a 61-year-old woman was hospitalized in Thailand on Jan. 13.
  • The next day, Dr. van Kerkhove told reporters: “It’s certainly possible that there is limited human-to-human transmission.” MERS and SARS, both coronaviruses, were transmissible among people in close quarters. Epidemiological investigations were under way, she said.
  • Over the next few years, emergency committees struggled over how to determine whether an outbreak was a PHEIC. It took months to declare emergencies for two deadly Ebola epidemics
  • The committee met over two days, but was split. They mostly agreed on one point: The information from China “was a little too imprecise to very clearly state that it was time” to recommend an emergency declaration,
  • On Jan. 28, Dr. Tedros and the WHO team arrived for their meeting with Mr. Xi
  • Leaning across three wooden coffee tables, Dr. Tedros pressed for cooperation. In the absence of information, countries might react out of fear and restrict travel to China, he repeated several times throughout the trip. Mr. Xi agreed to allow a WHO-led international team of experts to visit. It took until mid-February to make arrangements and get the team there.
  • China also agreed to provide more data, and Dr. Tedros departed, leaving Dr. Briand behind with a list of mysteries to solve. How contagious was the virus? How much were children or pregnant women at risk? How were cases linked? This was vital information needed to assess the global risk, Dr. Briand said
  • Back in Geneva, Dr. Tedros reconvened the emergency committee. By now it was clear there was human-to-human transmission in other countries. When it met on Jan. 30, the committee got the information the WHO had been seeking. This time the committee recommended and Dr. Tedros declared a global public-health emergency.
  • President Trump and New York Gov. Andrew Cuomo both assured constituents their health systems would perform well. The U.K.’s chief medical officer described the WHO’s advice as largely directed at poor and middle-income countries. As for keeping borders open, by then many governments had already closed them to visitors from China.
  • The WHO shifted focus to the developing world, where it believed Covid-19 would exact the heaviest toll. To its surprise, cases shot up just across the border, in northern Italy.
  • Lessons learned
  • If there were one thing the WHO might have done differently, it would be to offer wealthier countries the type of assistance with public-health interventions that the WHO provides the developing world
  • the WHO’s warning system of declaring a global public-health emergency needs to change. Some want to see a warning system more like a traffic light—with color-coded alarms for outbreaks, based on how worried the public should be
  • Emergency committees need clearer criteria for declaring a global public-health emergency and should publicly explain their thinking
  • The WHO should have more powers to intervene in countries to head off a health crisis
  • the WHO’s health emergencies unit should report to the director-general and not member states, and its budget should be protected so it doesn’t have to compete with other programs for money.
  • Implementing many of those ideas would require herding diplomats back for another monthslong slog of treaty revisions. If and when such talks begin, new governments will likely be in place, and political priorities will float elsewhere.
  • “Unfortunately, I’m very cynical about this,” he said. “We are living through cycles of panic and neglect. We’ve been through all of this before.”
Javier E

Opinion | Vaccine Hesitancy Is About Trust and Class - The New York Times - 0 views

  • The world needs to address the root causes of vaccine hesitancy. We can’t go on believing that the issue can be solved simply by flooding skeptical communities with public service announcements or hectoring people to “believe in science.”
  • For the past five years, we’ve conducted surveys and focus groups abroad and interviewed residents of the Bronx to better understand vaccine avoidance.
  • We’ve found that people who reject vaccines are not necessarily less scientifically literate or less well-informed than those who don’t. Instead, hesitancy reflects a transformation of our core beliefs about what we owe one another.
  • ...43 more annotations...
  • Over the past four decades, governments have slashed budgets and privatized basic services. This has two important consequences for public health
  • First, people are unlikely to trust institutions that do little for them.
  • second, public health is no longer viewed as a collective endeavor, based on the principle of social solidarity and mutual obligation. People are conditioned to believe they’re on their own and responsible only for themselves.
  • an important source of vaccine hesitancy is the erosion of the idea of a common good.
  • “People are thinking, ‘If the government isn’t going to do anything for us,’” said Elden, “‘then why should we participate in vaccines?’”
  • Since the spring, when most American adults became eligible for Covid vaccines, the racial gap in vaccination rates between Black and white people has been halved. In September, a national survey found that vaccination rates among Black and white Americans were almost identical.
  • Other surveys have determined that a much more significant factor was college attendance: Those without a college degree were the most likely to go unvaccinated.
  • Education is a reliable predictor of socioeconomic status, and other studies have similarly found a link between income and vaccination.
  • It turns out that the real vaccination divide is class.
  • compared with white Americans, communities of color do experience the American health care system differently. But a closer look at the data reveals a more complicated picture.
  • during the 1950s polio campaigns, for example, most people saw vaccination as a civic duty.
  • But as the public purse shrank in the 1980s, politicians insisted that it was no longer the government’s job to ensure people’s well-being; instead, Americans were to be responsible only for themselves and their own bodies
  • Entire industries, such as self-help and health foods, have sprung up on the principle that the key to good health lies in individuals making the right choices.
  • Without an idea of the common good, health is often discussed using the language of “choice.”
  • there are problems with reducing public health to a matter of choice. It gives the impression that individuals are wholly responsible for their own health.
  • This is despite growing evidence that health is deeply influenced by factors outside our control; public health experts now talk about the “social determinants of health,” the idea that personal health is never simply just a reflection of individual lifestyle choices, but also the class people are born into, the neighborhood they grew up in and the race they belong to.
  • food deserts and squalor are not easy problems to solve — certainly not by individuals or charities — and they require substantial government action.
  • Many medical schools teach “motivational interviewing,”
  • the deeper problem:
  • Being healthy is not cheap. Studies indicate that energy-dense foods with less nutritious value are more affordable, and low-cost diets are linked to obesity and insulin resistance.
  • This isn’t surprising, since we shop for doctors and insurance plans the way we do all other goods and services
  • Another problem with reducing well-being to personal choice is that this treats health as a commodity.
  • mothers devoted many hours to “researching” vaccines, soaking up parental advice books and quizzing doctors. In other words, they act like savvy consumers
  • When thinking as a consumer, people tend to downplay social obligations in favor of a narrow pursuit of self-interest. As one parent told Reich, “I’m not going to put my child at risk to save another child.”
  • Such risk-benefit assessments for vaccines are an essential part of parents’ consumer research.
  • Vaccine uptake is so high among wealthy people because Covid is one of the gravest threats they face. In some wealthy Manhattan neighborhoods, for example, vaccination rates run north of 90 percent.
  • For poorer and working-class people, though, the calculus is different: Covid-19 is only one of multiple grave threats.
  • When viewed in the context of the other threats they face, Covid no longer seems uniquely scary.
  • Most of the people we interviewed in the Bronx say they are skeptical of the institutions that claim to serve the poor but in fact have abandoned them.
  • he and his friends find reason to view the government’s sudden interest in their well-being with suspicion. “They are over here shoving money at us,” a woman told us, referring to a New York City offer to pay a $500 bonus to municipal workers to get vaccinated. “And I’m asking, why are you so eager, when you don’t give us money for anything else?”
  • These views reinforce the work of social scientists who find a link between a lack of trust and inequality. And without trust, there is no mutual obligation, no sense of a common good.
  • The experience of the 1960s suggests that when people feel supported through social programs, they’re more likely to trust institutions and believe they have a stake in society’s health.
  • Research shows that private systems not only tend to produce worse health outcomes than public ones, but privatization creates what public health experts call “segregated care,” which can undermine the feelings of social solidarity that are critical for successful vaccination drives
  • In one Syrian city, for example, the health care system now consists of one public hospital so underfunded that it is notorious for poor care, a few private hospitals offering high-quality care that are unaffordable to most of the population, and many unlicensed and unregulated private clinics — some even without medical doctors — known to offer misguided health advice. Under such conditions, conspiracy theories can flourish; many of the city’s residents believe Covid vaccines are a foreign plot.
  • In many developing nations, international aid organizations are stepping in to offer vaccines. These institutions are sometimes more equitable than governments, but they are often oriented to donor priorities, not community needs.
  • “We have starvation and women die in childbirth.” one tribal elder told us, “Why do they care so much about polio? What do they really want?”
  • In America, anti-vaccine movements are as old as vaccines themselves; efforts to immunize people against smallpox prompted bitter opposition in the turn of the last century. But after World War II, these attitudes disappeared. In the 1950s, demand for the polio vaccine often outstripped supply, and by the late 1970s, nearly every state had laws mandating vaccinations for school with hardly any public opposition.
  • What changed? This was the era of large, ambitious government programs like Medicare and Medicaid.
  • The anti-measles policy, for example, was an outgrowth of President Lyndon Johnson’s Great Society and War on Poverty initiatives.
  • While the reasons vary by country, the underlying causes are the same: a deep mistrust in local and international institutions, in a context in which governments worldwide have cut social services.
  • Only then do the ideas of social solidarity and mutual obligation begin to make sense.
  • The types of social programs that best promote this way of thinking are universal ones, like Social Security and universal health care.
  • If the world is going to beat the pandemic, countries need policies that promote a basic, but increasingly forgotten, idea: that our individual flourishing is bound up in collective well-being.
clairemann

AOC and Rashida Tlaib's Public Banking Act, explained - Vox - 0 views

  • A public option, but for banking. That’s what Reps. Rashida Tlaib and Alexandria Ocasio-Cortez are proposing in a new bill unveiled on Friday.
  • would foster the creation of public banks across the country by providing them a pathway to getting started, establishing an infrastructure for liquidity and credit facilities for them via the Federal Reserve, and setting up federal guidelines for them to be regulated.
  • which theoretically would be more motivated to do public good and invest in their communities than private institutions, which are out for profit.
  • ...28 more annotations...
  • The proposal lands in the midst of the Covid-19 pandemic, which has shed light on many inefficiencies in the American system, including banking. Take the Paycheck Protection Program, for example: It used the regular banking system as an intermediary, which ultimately meant that bigger businesses and those with preexisting relationships with those banks were prioritized over others.
  • guarantee a more equitable recovery by providing an alternative to Wall Street banks for state and local governments, businesses, and ordinary people,
  • The public banking bill also does double duty as a climate bill: It would prohibit public banks from investing in or doing business with the fossil fuel industry.
  • “Public banks empower states and municipalities to establish new channels of public investment to help solve systemic crises.”
  • But, he said, this proposal is particularly comprehensive and supportive.
  • If Democrats keep control of the House come 2021 and manage to flip the Senate and win the White House, they’ll be able to take some big legislative swings, including and perhaps especially on issues related to the economy.
  • at some point it’s just hitting a wall where it doesn’t carry them along and they’re looking for options,” said Tlaib, who represents Michigan’s 13th Congressional District, the third-poorest congressional district in the country. “So I’m putting this on the table as an option.”
  • To be clear, the Public Banking Act isn’t creating a federal public bank.
  • encourage and enable the creation of public banks across the US. It provides legitimacy to those who are pushing for more public banking, and it also includes regulators as key stakeholders who can support and provide guidance for how those banks should operate.
  • though different public banks would likely have different areas of emphasis.
  • They could also facilitate easier access to funds for state and local governments from the federal government or Federal Reserve.
  • “It’s basically a way to finance state and local investment that doesn’t go through Wall Street and doesn’t leave the community and turn into a windfall for shareholders,
  • “This is more about community development.”
  • Tlaib recalled hearing from her constituents when the $1,200 coronavirus stimulus checks went out this spring — people waiting days and weeks for direct deposits, or getting a check in the mail only to lose a substantial portion of it when cashing it at the store down the street.
  • The Public Banking Act allows the Federal Reserve to charter and grant membership to public banks and creates a grant program for the Treasury secretary to provide seed money for public banks to be formed, capitalized, and developed.
  • Public banks need the FDIC to provide assurances that it will recognize them in accordance with the bond rating of the city or state they represent.
  • McConnell said that FDIC guidance recognizing the city’s — and the state’s — public banks as carrying an AAA rating would send a clear signal to state financial regulators that the public bank is considered low risk.
  • The bill would also provide a road map for the FDIC, which insures bank deposits of up to $250,000, to insure deposits for public banks, so people feel assured they won’t lose all their money by choosing to open an account with their state bank instead of, say, Wells Fargo.
  • the Office of the Comptroller of the Currency (OCC) has historically been charged with chartering national banks in the US, not the Fed, meaning this is a fairly novel idea.
  • It prohibits the Fed and Treasury from considering the financial health of an entity that controls or owns a bank in grant-making decisions.
  • So here is the thing about private companies, including, yes, banks: The point of them is to make money, and that drives their decisions. It’s not necessarily evil (though sometimes it kind of is), but it’s just how they work.
  • The idea behind public banking isn’t that Goldman Sachs, Wells Fargo, and Morgan Stanley go away; it’s that they have to compete with a government-owned entity — and one that’s a little fairer and more ethical in how it does business.
  • Public banks, as imagined in the Tlaib/Ocasio-Cortez proposal, would provide loans to small businesses and governments with lower interest rates and lower fees.
  • Student loans are facilitated directly with BND, but other loans, called participation loans, go through a local financial institution — often with BND support.
  • According to a study on public banks, BND had some $2 billion in active participation loans in 2014. BND can grant larger loans at a lower risk, which fosters a healthy financial ecosystem populated by a cluster of small North Dakota banks.
  • Democrats have a lot of ideas, and if they take power come January 2021, there’s a lot they can do.
  • The Public Banking Act is meant to complement ideas such as the ABC Act and postal banking. And, of course, it’s linked to the Green New Deal, not only because it would bar public banks from financing things that hurt the environment, but also because the idea is that public banks would play a major role in financing Green New Deal and climate-friendly projects.
  • If former Vice President Joe Biden wins the White House and Democrats control both the House and the Senate come 2021, the talk around these ideas becomes a lot more serious.
Javier E

Ozempic or Bust - The Atlantic - 0 views

  • June 2024 Issue
  • it is impossible to know, in the first few years of any novel intervention, whether its success will last.
  • ...77 more annotations...
  • The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started
  • Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.
  • The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic.
  • Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”
  • Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)
  • In the United States, an estimated 189 million adults are classified as having obesity or being overweight
  • The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight
  • For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.
  • The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or cholesterol. By those criteria, more than 140 million American adults already qualify
  • if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”
  • How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk
  • The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years
  • the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not borne out.
  • adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time?
  • “All of us, in the back of our minds, always wonder, Will something show up?  ” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”
  • in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.
  • The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories.
  • But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.
  • Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses.”
  • Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might produce in the very best scenario an average lasting weight loss of just 5 to 10 percent
  • National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.
  • Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm.
  • In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,”
  • In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million.
  • It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life.
  • the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”
  • News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque.
  • Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.”
  • “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”
  • She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”
  • For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.
  • In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.
  • In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.
  • The message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water — plain old water — as something “cool” to drink.
  • Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could.
  • Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country.
  • An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.
  • when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.
  • The social engineering worked. Slowly but surely, Americans’ lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out.
  • although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.
  • sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item,
  • that tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.
  • Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.
  • From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.
  • obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”
  • she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions
  • Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.
  • if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it
  • She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit.
  • Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.
  • “Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.
  • Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.
  • As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid
  • By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners
  • But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.
  • Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon who had struggled to overcome an eating disorder had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look
  • The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. In 2015, that rate had risen to three-quarters.
  • In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”
  • that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity.
  • That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism
  • But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”
  • Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events.
  • As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell.
  • Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed.
  • In the meantime, thinness was coming back into fashion
  • In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.
  • Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight.
  • If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat.
  • Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.
  • Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”
  • When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.
  • the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now
  • Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies.
  • “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”
  • The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways?
  • When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.”
  • why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.
Javier E

Health insurance whistleblower: I lied to Americans about Canadian medicine - The Washi... - 0 views

  • In my prior life as an insurance executive, it was my job to deceive Americans about their health care. I misled people to protect profits
  • That work contributed directly to a climate in which fewer people are insured, which has shaped our nation’s struggle against the coronavirus, a condition that we can fight only if everyone is willing and able to get medical treatment. Had spokesmen like me not been paid to obscure important truths about the differences between the U.S. and Canadian health-care systems, tens of thousands of Americans who have died during the pandemic might still be alive.
  • In 2007, I was working as vice president of corporate communications for Cigna.
  • ...20 more annotations...
  • I spent much of that year as an industry spokesman, my last after 20 years in the business, spreading AHIP’s “information” to journalists and lawmakers to create the impression that our health-care system was far superior to Canada’s, which we wanted people to believe was on the verge of collapse.
  • The campaign worked. Stories began to appear in the press that cast the Canadian system in a negative light. And when Democrats began writing what would become the Affordable Care Act in early 2009, they gave no serious consideration to a publicly financed system like Canada’s.
  • Today, the respective responses of Canada and the United States to the coronavirus pandemic prove just how false the ideas I helped spread were.
  • There are more than three times as many coronavirus infections per capita in the United States, and the mortality rate is twice the rate in Canada.
  • The most effective myth we perpetuated — the industry trots it out whenever major reform is proposed — is that Canadians and people in other single-payer countries have to endure long waits for needed care.
  • While it’s true that Canadians sometimes have to wait weeks or months for elective procedures (knee replacements are often cited), the truth is that they do not have to wait at all for the vast majority of medical services.
  • And, contrary to another myth I used to peddle — that Canadian doctors are flocking to the United States — there are more doctors per 1,000 people in Canada than here. Canadians see their doctors an average of 6.8 times a year, compared with just four times a year in this country.
  • Most important, no one in Canada is turned away from doctors because of a lack of funds, and Canadians can get tested and treated for the coronavirus without fear of receiving a budget-busting medical bill.
  • In America, exorbitant bills are a defining feature of our health-care system. Despite the assurances from President Trump and members of Congress that covid-19 patients will not be charged for testing or treatment, they are on the hook for big bills, according to numerous reports.
  • That is not the case in Canada, where there are no co-pays, deductibles or coinsurance for covered benefits. Care is free at the point of service. And those laid off in Canada don’t face the worry of losing their health insurance. In the United States, by contrast, more than 40 million have lost their jobs during this pandemic, and millions of them — along with their families — also lost their coverage.
  • Then there’s quality of care. By numerous measures, it is better in Canada. Some examples: Canada has far lower rates than the United States of hospitalizations from preventable causes like diabetes (almost twice as common here) and hypertension (more than eight times as common).
  • And even though Canada spends less than half what we do per capita on health care, life expectancy there is 82 years, compared with 78.6 years in the United States.
  • Of the many regrets I have about what I once did for a living, one of the biggest is slandering Canada’s health-care system. If the United States had undertaken a different kind of reform in 2009 (or anytime since), one that didn’t rely on private insurance companies that have every incentive to limit what they pay for, we’d be a healthier country today.
  • Living without insurance dramatically increases your chances of dying unnecessarily. Over the past 13 years, tens of thousands of Americans have probably died prematurely because, unlike our neighbors to the north, they either had no coverage or were so inadequately insured that they couldn’t afford the care they needed. I live with that horror, and my role in it, every day.
  • There were more specific reasons to be skeptical of those claims. We didn’t know, for example, who conducted that 2004 survey or anything about the sample size or methodology — or even what criteria were used to determine who qualified as a “business leader.” We didn’t know if the assertion about imaging equipment was based on reliable data or was an opinion. You could easily turn up comparable complaints about outdated equipment at U.S. hospitals.
  • Another bullet point, from the same book, quoted the CEO of the Canadian Association of Radiologists as saying that “the radiology equipment in Canada is so bad that ‘without immediate action radiologists will no longer be able to guarantee the reliability and quality of examinations.’ ”
  • Here’s an example from one AHIP brief in the binder: “A May 2004 poll found that 87% of Canada’s business leaders would support seeking health care outside the government system if they had a pressing medical concern.” The source was a 2004 book by Sally Pipes, president of the industry-supported Pacific Research Institute,
  • We enlisted APCO Worldwide, a giant PR firm. Agents there worked with AHIP to put together a binder of laminated talking points for company flacks like me to use in news releases and statements to reporters.
  • Clearly my colleagues and I would need a robust defense. On a task force for the industry’s biggest trade association, America’s Health Insurance Plans (AHIP), we talked about how we might make health-care systems in Canada, France, Britain and even Cuba look just as bad as ours.
  • That summer, Michael Moore was preparing to release his latest documentary, “Sicko,” contrasting American health care with that in other rich countries. (Naturally, we looked terrible.) I spent months meeting secretly with my counterparts at other big insurers to plot our assault on the film, which contained many anecdotes about patients who had been denied coverage for important treatments.
Javier E

Fear of covid-19 exposes lack of health literacy - The Washington Post - 0 views

  • Fear of covid-19 is exposing a lack of health literacy in this country that is not new. The confusion is amplified during a health emergency, however, by half-truths swirling in social media and misinformed statements by people in the public eye.
  • One in five people struggle with health information
  • The people most likely to have low health literacy include those dying in greater numbers from covid-19: older adults, racial and ethnic minorities, nonnative English speakers, and people with low income and education levels.
  • ...16 more annotations...
  • “It’s easy to misunderstand [medical information],” says Wolf, who is also founding director of the medical school’s Health Literacy and Learning Program. Some will be too ashamed to say so while others won’t realize they missed a critical detail
  • Magnani has patients who don’t believe they have high blood pressure because their lives aren’t stressful. Or respond with “Great news!” when he tells them a test result was “positive.”
  • But low health literacy cuts across all demographics
  • “Given the right headache or stress about a sick child, [gaps in comprehension] can happen to anyone. When you don’t feel well, you don’t think as clearly.”
  • Health literacy is not about reading skills or having a college degree. It means you know how to ask a doctor the right questions, read a food label, understand what you’re signing on a consent form, and have the numeric ability to analyze relative risks when making treatment decisions.
  • “None of this is intuitive,”
  • “Covid has brought to fore the vast inequities in society,” says cardiologist Jared W. Magnani, associate professor of medicine at the University of Pittsburgh School of Medicine. If you don’t understand words such as “immunocompromised” or “comorbidity,” for instance, you miss cautionary information that could save your life.
  • Misunderstandings over hospital discharge or medication instructions can undo the best medical care. Yet, nearly 1 in 3 of the 17,309 people in a study by researchers from the Agency for Healthcare Research and Quality (AHRQ) responded that instructions from a health provider were “not easy to understand.”
  • Wolf says he was surprised during a study on reading prescription labels by how many high school graduates could not follow medication instructions. “Being able to read the label doesn’t mean you can interpret it,”
  • “Take two pills, twice daily” was frequently misunderstood. Replacing the awkward wording with “Take two in the morning and two at bedtime” would solve that, Wolf says. Health-care professionals “need to meet people where they’re at.”
  • Health literacy is the best predictor of someone’s health status, some physicians maintain. Decades of research consistently link low health literacy to poorer medical outcomes, more hospitalizations and emergency room visits, and higher health-care costs
  • Anatomy knowledge is another gap.
  • Explaining medical risk and probability is another challenge.
  • Over 3,000 studies found that health education materials far exceed the eighth-grade reading level of the average American, too. Beyond not using plain language (“joint pain,” not “arthritis”), texts assume the patient knows more than they do. Telling people to sanitize surfaces to kill the coronavirus means little if you don’t tell them what to use and how to do it, Caballero says. “What does it mean to practice good respiratory hygiene?” she asks. “These are not actionable instructions.”
  • Doctors are encouraged to employ the teach-back technique, meaning the doctor asks the patient to repeat what they’ve heard rather than simply asking, “Do you understand?”
  • In addition, “health care is becoming a harder test,” Wolf says. Health billing and insurance options can be impossible to navigate. We have an aging population with more chronic conditions and cognitive decline. And more is being asked of patients such as testing their own blood sugar or blood pressure.
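The numeracy the article describes, weighing relative risks before a treatment decision, comes down to simple arithmetic that is easy to hear the wrong way. The sketch below is a minimal illustration of the gap between relative and absolute risk reduction; the drug, the 4 percent baseline risk, the 3 percent treated risk, and the risk_summary helper are all hypothetical, invented only to make the arithmetic concrete.

```python
# A minimal sketch of the numeracy described above: the difference between a
# relative and an absolute risk reduction. All figures are hypothetical and
# chosen only to make the arithmetic concrete; they are not from the article.

def risk_summary(baseline_risk: float, treated_risk: float) -> dict:
    """Summarize a treatment's effect three ways a patient might hear it."""
    absolute_reduction = baseline_risk - treated_risk
    relative_reduction = absolute_reduction / baseline_risk
    number_needed_to_treat = 1 / absolute_reduction
    return {
        "absolute_reduction": absolute_reduction,
        "relative_reduction": relative_reduction,
        "number_needed_to_treat": number_needed_to_treat,
    }

# Hypothetical drug: 10-year risk of a heart attack falls from 4% to 3%.
summary = risk_summary(baseline_risk=0.04, treated_risk=0.03)
print(f"Relative risk reduction:  {summary['relative_reduction']:.0%}")             # 25%
print(f"Absolute risk reduction:  {summary['absolute_reduction']:.1%}")             # 1.0%
print(f"Number needed to treat:   {summary['number_needed_to_treat']:.0f} people")  # 100
```

The same hypothetical drug can be described as cutting risk "by 25 percent" or "by one percentage point"; a patient with low health numeracy typically hears only the first framing.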
anonymous

Opinion | The Coronavirus Has Laid Bare the Inequality of America's Health Care - The N... - 0 views

  • The notion of price control is anathema to health care companies. It threatens their basic business model, in which the government grants them approvals and patents, pays whatever they ask, and works hand in hand with them as they deliver the worst health outcomes at the highest costs in the rich world.
  • The American health care industry is not good at promoting health, but it excels at taking money from all of us for its benefit. It is an engine of inequality.
  • the virus also provides an opportunity for systemic change. The United States spends more than any other nation on health care, and yet we have the lowest life expectancy among rich countries. And although perhaps no system can prepare for such an event, we were no better prepared for the pandemic than countries that spend far less.
  • ...25 more annotations...
  • One way or another, everyone pays for health care. It accounts for about 18 percent of G.D.P. — nearly $11,000 per person. Individuals directly pay about a quarter, the federal and state governments pay nearly half, and most of the rest is paid by employers. (A back-of-the-envelope breakdown of these figures appears after this list.)
  • Many Americans think their health insurance is a gift from their employers — a “benefit” bestowed on lucky workers by benevolent corporations. It would be more accurate to think of employer-provided health insurance as a tax.
  • Rising health care costs account for much of the half-century decline in the earnings of men without a college degree, and contribute to the decline in the number of less-skilled jobs.
  • Employer-based health insurance is a wrecking ball, destroying the labor market for less-educated workers and contributing to the rise in “deaths of despair.”
  • We face a looming trillion-dollar federal deficit caused almost entirely by the rising costs of Medicaid and Medicare, even without the recent coronavirus relief bill.
  • Rising costs are an untenable burden on our government, too. States’ payments for Medicaid have risen from 20.5 percent of their spending in 2008 to 28.9 percent in 2019. To meet those rising costs, states have cut their financing for roads, bridges and state universities. Without those crucial investments, the path to success for many Americans is cut off
  • Every year, the United States spends $1 trillion more than is needed for high quality care.
  • executives at hospitals, medical device makers and pharmaceutical companies, and some physicians, are very well paid.
  • American doctors control access to their profession through a system that limits medical school admissions and the entry of doctors trained abroad — an imbalance that was clear even before the pandemic
  • Hospitals, many of them classified as nonprofits, have consolidated, with monopolies over health care in many cities, and they have used that monopoly power to raise prices
  • These are all strategies that lawmakers and regulators could put a stop to, if they choose.
  • The health care industry has armored itself, employing five lobbyists for each elected member of Congress. But public anger has been building — over drug prices, co-payments, surprise medical bills — and now, over the fragility of our health care system, which has been laid bare by the pandemic
  • A single-payer system is just one possibility. There are many systems in wealthy countries to choose from, with and without insurance companies, with and without government-run hospitals. But all have two key characteristics: universal coverage — ideally from birth — and cost control.
  • In the United States, public funding is likely to play a significant role in any treatments or vaccines that are eventually developed for Covid-19. Americans should demand that they be available at a reasonable price to everyone — not in the sole interest of drug companies.
  • We are believers in free-market capitalism, but health care is not something it can deliver in a socially tolerable way.
  • They choose not to. And so we Americans have too few doctors, too few beds and too few ventilators — but lots of income for providers
  • America is a rich country that can afford a world-class health care system. We should be spending a lot of money on care and on new drugs. But we need to spend to save lives and reduce sickness, not on expensive, income-generating procedures that do little to improve health. Or worst of all, on enriching pharma companies that feed the opioid epidemic.
  • Medical device manufacturers have also consolidated, in some cases using a “catch and kill” strategy to swallow up nimbler start-ups and keep the prices of their products high.
  • Ambulance services and emergency departments that don’t accept insurance have become favorites of private equity investors because of their high profits
  • Britain, for example, has the National Institute for Health and Care Excellence, which vets drugs, devices and procedures for their benefit relative to cost
  • At the very least, America must stop financing health care through employer-based insurance, which encourages some people to work but eliminates jobs for less-skilled workers
  • Our system takes from the poor and working class to generate wealth for the already wealthy.
  • passed a coronavirus bill including $3.1 billion to develop and produce drugs and vaccines.
  • The industry might emerge as a superhero of the war against Covid-19, like the Royal Air Force in the Battle of Britain during World War II.
  • Millions have lost their paychecks and their insurance
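The spending figures quoted above, roughly $11,000 per person per year with individuals paying about a quarter, governments nearly half, and employers most of the rest, translate into a simple per-person breakdown. The sketch below is only an illustration: the shares are the op-ed's approximations, rounded here, and the resulting dollar splits are derived for the example rather than taken from budget data.

```python
# Back-of-the-envelope split of the spending figures quoted above: roughly
# $11,000 per person per year, with individuals paying about a quarter,
# federal and state governments nearly half, and employers most of the rest.
# The shares are the op-ed's approximations, rounded here for illustration.

per_person_spending = 11_000  # dollars per person per year, as cited above

approximate_shares = {
    "individuals (premiums, out of pocket)": 0.25,
    "federal and state governments": 0.50,
    "employers": 0.25,
}

for payer, share in approximate_shares.items():
    dollars = share * per_person_spending
    print(f"{payer:40s} ~${dollars:,.0f} per person per year")
```

Seen this way, the employer share (about $2,750 per person under these rough assumptions) is the hidden, roughly flat levy the authors compare to a wrecking ball for less-skilled jobs: it consumes a far larger fraction of a low-wage worker's total compensation than of a high-wage worker's.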
Javier E

States and experts begin pursuing a coronavirus national strategy in absence of White H... - 0 views

  • A national plan to fight the coronavirus pandemic in the United States and return Americans to jobs and classrooms is emerging — but not from the White House.
  • a collection of governors, former government officials, disease specialists and nonprofits are pursuing a strategy that relies on the three pillars of disease control:
  • Ramp up testing to identify people who are infected.
  • ...40 more annotations...
  • Find everyone they interact with by deploying contact tracing on a scale America has never attempted before.
  • Focus restrictions more narrowly on the infected and their contacts so the rest of society doesn’t have to stay in permanent lockdown.
  • Instead, the president and his top advisers have fixated almost exclusively on plans to reopen the U.S. economy by the end of the month, though they haven’t detailed how they will do so without triggering another outbreak
  • Administration officials, speaking on the condition of anonymity to describe internal deliberations, say the White House has made a deliberate political calculation that it will better serve Trump’s interest to put the onus on governors — rather than the federal government — to figure out how to move ahead.
  • without substantial federal funding, states’ efforts will only go so far
  • The next failure is already on its way, Frieden said, because “we’re not doing the things we need to be doing in April.”
  • In recent days, dozens of leading voices have coalesced around the test-trace-quarantine framework, including former FDA commissioners for the Trump and George W. Bush administrations, Microsoft founder Bill Gates and top experts at Johns Hopkins, Columbia and Harvard universities.
  • On Wednesday, former president Barack Obama weighed in, tweeting, “Social distancing bends the curve and relieves some pressure … But in order to shift off current policies, the key will be a robust system of testing and monitoring — something we have yet to put in place nationwide.”
  • And Friday, Apple and Google unveiled a joint effort on new tools that would use smartphones to aid in contact tracing.
  • What remains unclear is whether this emerging plan can succeed without the backing of the federal government.
  • “It’s mind-boggling, actually, the degree of disorganization,” said Tom Frieden, former Centers for Disease Control and Prevention director. The federal government has already squandered February and March, he noted, committing “epic failures” on testing kits, ventilator supply, protective equipment for health workers and contradictory public health communication.
  • Experts and leaders in some states say remedying that weakness should be a priority and health departments should be rapidly shored up so that they are ready to act in coming weeks as infections nationwide begin to decrease
  • In America, testing — while still woefully behind — is ramping up. And households across the country have learned over the past month how to quarantine. But when it comes to the second pillar of the plan — the labor-intensive work of contact tracing — local health departments lack the necessary staff, money and training.
  • In South Korea, Taiwan, China and Singapore, variations on this basic strategy were implemented by their national governments, allowing them to keep the virus in check even as they reopened parts of their economy and society
  • In a report released Friday, the Johns Hopkins Center for Health Security and the Association of State and Territorial Health Officials — which represents state health departments — estimate 100,000 additional contact tracers are needed and call for $3.6 billion in emergency funding from Congress.
  • “We can’t afford to have multiple community outbreaks that can spiral up into sustained community transmission,” he said in the interview.
  • Unless states can aggressively trace and isolate the virus, experts say, there will be new outbreaks and another round of disruptive stay-at-home orders.
  • “All people are talking about right now is hospital beds, ventilators, testing, testing, testing. Yes, those are important, but they are all reactive. You are dealing with the symptoms and not the virus itself,”
  • The nonprofit Partners in Health quickly put together a plan to hire and train 1,000 contact tracers. Working from their homes making 20 to 30 calls a day, they could cover up to 20,000 contacts a day.
  • Testing on its own is useless, Nyenswah explained, because it only tells you who already has the virus. Similarly, tracing alone is useless if you don’t place those you find into quarantine. But when all three are implemented, the chain of transmission can be shattered.
  • Until a vaccine or treatment is developed, such nonpharmaceutical interventions are the only tools countries can rely on — besides locking down their cities.
  • to expand that in a country as large as the United States will require a massive dose of money, leadership and political will.
  • “You cannot have leaders contradicting each other every day. You cannot have states waiting on the federal government to act, and government telling the states to figure it out on their own,” he said. “You need a plan.”
  • When Vermont’s first coronavirus case was detected last month, it took two state health workers a day to track down 13 people who came into contact with that single patient. They put them under quarantine and started monitoring for symptoms. No one else became sick.
  • He did the math: If each of those 30 patients had contact with even three people, that meant 90 people his crew would have to locate and get into quarantine. In other words, impossible.
  • Since 2008, city and county health agencies have lost almost a quarter of their overall workforce. Decades of budget cuts have left them unable to mount such a response. State health departments have recently had to lay off thousands more — an unintended consequence of federal officials delaying tax filings until July without warning states.
  • In Wuhan, a city of 11 million, the Chinese had 9,000 health workers doing contact tracing, said Frieden, the former CDC director. He estimates authorities would need roughly one contact tracer for every four cases in the United States. (A rough version of this arithmetic is sketched after this list.)
  • “In the second wave, we have to have testing, a resource base, and a contact-tracing base that is so much more scaled up than right now,” he said. “It’s an enormous challenge.”
  • Gov. Charlie Baker (R) partnered with an international nonprofit group based in Boston
  • “You will never beat a virus like this one unless you get ahead of it. America must not just flatten the curve but get ahead of the curve.”
  • The group is paying new hires roughly the same salary as census takers, more than $20 an hour. As of Tuesday — just four days after the initial announcement — the group had received 7,000 applicants and hired 150.
  • “There’s a huge untapped resource of people in America if we would just ask.”
  • “There needs to be a crash course in contact tracing because a lot of the health departments where this is going to need to happen are already kind of flat-out just trying to respond to the crisis at hand,”
  • Experts have proposed transforming the Peace Corps — which suspended global operations last month and recalled 7,000 volunteers to America — into a national response corps that could perform many tasks, including contact tracing.
  • On Wednesday, the editor in chief of JAMA, a leading medical journal, proposed suspending the first year of training for America’s 20,000 incoming medical students and deploying them as a medical corps to support the “test, trace, track, and quarantine strategy.”
  • The national organization for local STD programs says $200 million could add roughly 1,850 specialists, more than doubling that current workforce.
  • Technology could also turn out to be pivotal. But the invasive nature of cellphone tracking and apps raises concerns about civil liberties.
  • Such technology could take over some of what contact tracers do in interviews: build a contact history for each confirmed patient and find those possibly exposed. Doing that digitally could speed up the process, which is critical in containing an outbreak, and make it less laborious.
  • In China, authorities combined the nation’s vast surveillance apparatus with apps and cellphone data to track people’s movements. If someone they came across is later confirmed as infected, an app alerts them to stay at home.
  • In the United States, about 20 technology companies are trying to create a contact tracing app using geolocation data or Bluetooth pings on cellphones
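The staffing ratios quoted above lend themselves to a quick capacity calculation. The sketch below uses the article's figures (roughly one tracer per four new cases, 20 to 30 calls per tracer per day, pay a bit over $20 an hour); the helper functions, the 8-hour workday, and the 30,000 daily-case input are illustrative assumptions rather than numbers from the article.

```python
# A rough capacity model built from figures quoted in the article: about one
# tracer per four new cases (Frieden's estimate), 20 to 30 calls per tracer
# per day, and pay of a bit over $20 an hour. The helper functions, the
# 8-hour workday, and the 30,000 daily-case input are illustrative
# assumptions, not numbers from the article.

def tracers_needed(new_cases_per_day: int, cases_per_tracer: int = 4) -> int:
    """Frieden's rule of thumb: roughly one contact tracer for every four cases."""
    return -(-new_cases_per_day // cases_per_tracer)  # ceiling division

def contacts_reachable(tracers: int, calls_per_day: int = 25) -> int:
    """Partners in Health planning figure: each tracer makes 20-30 calls a day."""
    return tracers * calls_per_day

hypothetical_daily_cases = 30_000
staff = tracers_needed(hypothetical_daily_cases)
daily_payroll = staff * 20 * 8  # ~$20/hour, assumed 8-hour day

print(f"Tracers needed for {hypothetical_daily_cases:,} cases/day: ~{staff:,}")
print(f"Contacts reachable per day: ~{contacts_reachable(staff):,}")
print(f"Approximate daily payroll: ~${daily_payroll:,}")
```

The Apple-Google effort and the roughly 20 companies mentioned above build on the same basic idea: phones broadcast short-lived random identifiers over Bluetooth and remember the identifiers they hear, and when a user reports a positive test, other phones can check for matches. The toy model below illustrates that flow under simplified assumptions; the Phone class and token scheme are invented for illustration and this is not the actual Apple-Google protocol or any specific company's design.

```python
# A toy model of Bluetooth-proximity contact tracing: each phone broadcasts
# short-lived random tokens and remembers the tokens it hears nearby. When a
# user reports a positive test, their recent tokens are published and other
# phones check locally for matches. The Phone class and token scheme are
# invented for illustration; this is not the Apple-Google protocol or any
# specific company's design.
import secrets

class Phone:
    def __init__(self, owner: str):
        self.owner = owner
        self.own_tokens: list[str] = []      # tokens this phone has broadcast
        self.heard_tokens: set[str] = set()  # tokens heard from nearby phones

    def broadcast(self) -> str:
        token = secrets.token_hex(8)         # rotating random identifier
        self.own_tokens.append(token)
        return token

    def hear(self, token: str) -> None:
        self.heard_tokens.add(token)

def encounter(a: Phone, b: Phone) -> None:
    """Two phones within Bluetooth range exchange their current tokens."""
    a.hear(b.broadcast())
    b.hear(a.broadcast())

alice, bob, carol = Phone("alice"), Phone("bob"), Phone("carol")
encounter(alice, bob)    # alice and bob were near each other
encounter(bob, carol)    # bob and carol were near each other

# Bob tests positive and publishes the tokens his phone broadcast.
published = set(bob.own_tokens)
for phone in (alice, carol):
    exposed = bool(phone.heard_tokens & published)
    print(f"{phone.owner}: possible exposure = {exposed}")
```

In this sketch the matching happens on each phone against a published token list rather than in a central log of movements, which is one way such apps can try to blunt the civil liberties concerns raised above; even so, they would only supplement the interview work human tracers do.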
Javier E

Colonoscopies Explain Why U.S. Leads the World in Health Expenditures - NYTimes.com - 0 views

  • In many other developed countries, a basic colonoscopy costs just a few hundred dollars and certainly well under $1,000. That chasm in price helps explain why the United States is far and away the world leader in medical spending, even though numerous studies have concluded that Americans do not get better care.
  • Whether directly from their wallets or through insurance policies, Americans pay more for almost every interaction with the medical system. They are typically prescribed more expensive procedures and tests than people in other countries, no matter if those nations operate a private or national health system. A list of drug, scan and procedure prices compiled by the International Federation of Health Plans, a global network of health insurers, found that the United States came out the most costly in all 21 categories — and often by a huge margin.
  • “The U.S. just pays providers of health care much more for everything,” said Tom Sackville, chief executive of the health plans federation and a former British health minister.
  • ...12 more annotations...
  • Largely an office procedure when widespread screening was first recommended, colonoscopies have moved into surgery centers — which were created as a step down from costly hospital care but are now often a lucrative step up from doctors’ examining rooms — where they are billed like a quasi operation.
  • The high price paid for colonoscopies mostly results not from top-notch patient care, according to interviews with health care experts and economists, but from business plans seeking to maximize revenue; haggling between hospitals and insurers that have no relation to the actual costs of performing the procedure; and lobbying, marketing and turf battles among specialists that increase patient fees.
  • While several cheaper and less invasive tests to screen for colon cancer are recommended as equally effective by the federal government’s expert panel on preventive care — and are commonly used in other countries — colonoscopy has become the go-to procedure in the United States. “We’ve defaulted to by far the most expensive option, without much if any data to support it,”
  • Hospitals, drug companies, device makers, physicians and other providers can benefit by charging inflated prices, favoring the most costly treatment options and curbing competition that could give patients more, and cheaper, choices. And almost every interaction can be an opportunity to send multiple, often opaque bills with long lists of charges: $100 for the ice pack applied for 10 minutes after a physical therapy session, or $30,000 for the artificial joint implanted in surgery.
  • Even doctors often do not know the costs of the tests and procedures they prescribe. When Dr. Michael Collins, an internist in East Hartford, Conn., called the hospital that he is affiliated with to price lab tests and a colonoscopy, he could not get an answer. “It’s impossible for me to think about cost,” he said
  • The more than $35,000 annually that Ms. Yapalater and her employer collectively pay in premiums — her share is $15,000 — for her family’s Oxford Freedom Plan would be more than sufficient to cover their medical needs in most other countries. She and her husband, Jeff, 63, a sales and marketing consultant, have three children in their 20s with good jobs. Everyone in the family exercises, and none has had a serious illness.
  • A major factor behind the high costs is that the United States, unique among industrialized nations, does not generally regulate or intervene in medical pricing, aside from setting payment rates for Medicare and Medicaid, the government programs for older people and the poor. Many other countries deliver health care on a private fee-for-service basis, as does much of the American health care system, but they set rates as if health care were a public utility or negotiate fees with providers and insurers nationwide, for example.
  • “In the U.S., we like to consider health care a free market,” said Dr. David Blumenthal, president of the Commonwealth Fund and a former adviser to President Obama. “But it is a very weird market, riddled with market failures.”
  • Consumers, the patients, do not see prices until after a service is provided, if they see them at all. And there is little quality data on hospitals and doctors to help determine good value, aside from surveys conducted by popular Web sites and magazines. Patients with insurance pay a tiny fraction of the bill, providing scant disincentive for spending.
  • The United States spends about 18 percent of its gross domestic product on health care, nearly twice as much as most other developed countries. The Congressional Budget Office has said that if medical costs continue to grow unabated, “total spending on health care would eventually account for all of the country’s economic output.”
  • Instead, payments are often determined in countless negotiations between a doctor, hospital or pharmacy, and an insurer, with the result often depending on their relative negotiating power. Insurers have limited incentive to bargain forcefully, since they can raise premiums to cover costs.
  • “People think it’s like other purchases: that if you pay more you get a better car. But in medicine, it’s not like that.”
brickol

'It's a Leadership Argument': Coronavirus Reshapes Health Care Fight - The New York Times - 0 views

  • Democrats were already talking about health care before the coronavirus, but the outbreak gives new urgency to a central issue for the party.
  • The future of America’s health insurance system has already been a huge part of the 2020 presidential race. At campaign events over the past year, voters have shared stories of cancer diagnoses, costly medications and crushing medical debt.
  • “Health care was always going to be a big issue in the general election, and the coronavirus epidemic will put health care even more top of mind for voters,”
  • ...10 more annotations...
  • That was before more than 68,000 people in the United States tested positive for the coronavirus, grinding the country to a halt, upending lives from coast to coast, and postponing primary elections in many states. The virus has made the stakes, and the differing visions the two parties have for health care in America, that much clearer.
  • “Sometimes these health care debates can get a bit abstract, but when it’s an immediate threat to the health of you and your family, it becomes a lot more real.”
  • While the Democrats spent much of their primary fighting about whether to push for “Medicare for all” or build on the Affordable Care Act, the coronavirus crisis may streamline the debate to their advantage: At a time when the issue of health care is as pressing as ever, they can present themselves as the party that wants people to have sufficient coverage while arguing that the Republicans do not.
  • “A crisis like the coronavirus epidemic highlights the stake that everyone has in the care of the sick,” said Paul Starr, a professor of sociology and public affairs at Princeton who served as a health policy adviser in the Clinton White House. “It really strengthens the Democratic case for expanded health coverage, and that should work, I should think, to Biden’s advantage in a campaign against Trump.”
  • The virus is also having dire economic consequences, depriving Mr. Trump of a potent re-election argument rooted in stock market gains and low unemployment numbers. It is testing Mr. Trump’s leadership in the face of a national emergency like nothing he has encountered, and if voters give him poor marks, that could inflict lasting damage on his chances in November’s general election.
  • Four years ago, Mr. Trump ran for president promising to repeal the Affordable Care Act, popularly known as Obamacare. But his campaign pledge quickly turned into a debacle in the first year of his presidency when Republicans struggled and ultimately failed to repeal and replace the health law. In the midterm elections the next year, Democrats emphasized health care, highlighting issues like preserving protections for people with pre-existing conditions, and they won control of the House.
  • Mr. Trump is particularly vulnerable on the issue of health care. Over the course of his presidency, his administration has repeatedly taken steps to undermine the Affordable Care Act, including by arguing in court that the entire law should be invalidated. The Supreme Court agreed this month to hear an appeal in that case, which is the latest major challenge to the law. The court is not expected to rule until next year, but Democrats point to the Trump administration’s legal position as yet another example of the president’s desire to shred the Affordable Care Act.
  • In his campaign, Mr. Biden has already put a focus on health care, promising to build on the Affordable Care Act and create a so-called public option, an optional government plan that consumers could purchase. On the campaign trail, he has talked about his own exposure to the health care system
  • In the Democratic primary race, the health care debate has largely focused on the divide between moderate-leaning Democrats looking to build on the Affordable Care Act and progressives calling for Medicare for all, a government-run health insurance program. Mr. Biden and Mr. Sanders represent the two sides of that argument.
  • In a poll this month by Morning Consult, four in 10 Americans said the coronavirus outbreak had made them more likely to support universal health care proposals in which everyone would receive their health insurance from the government.
Javier E

America has no real public health system - coronavirus has a clear run | Robert Reich |... - 0 views

  • Fauci, director of the National Institute of Allergy and Infectious Diseases and just about the only official in the Trump administration trusted to tell the truth about the coronavirus, said last Thursday: “The system does not, is not really geared to what we need right now … It is a failing, let’s admit it.”
  • The system would be failing even under a halfway competent president. The dirty little secret, which will soon become apparent to all, is that there is no real public health system in the United States.
  • America is waking up to the fact that it has almost no public capacity to deal with it.
  • ...12 more annotations...
  • Instead of a public health system, we have a private for-profit system for individuals lucky enough to afford it and a rickety social insurance system for people fortunate enough to have a full-time job.
  • At their best, both systems respond to the needs of individuals rather than the needs of the public as a whole
  • In America, the word “public” – as in public health, public education or public welfare – means a sum total of individual needs, not the common good.
  • Almost 30% of American workers have no paid sick leave from their employers, including 70% of low-income workers earning less than $10.49 an hour. Vast numbers of self-employed workers cannot afford sick leave. Friday’s deal between House Democrats and the White House won’t have much effect because it exempts large employers and offers waivers to smaller ones.
  • there are no institutions analogous to the Fed with responsibility for overseeing and managing the public’s health – able to whip out a giant checkbook at a moment’s notice to prevent human, rather than financial, devastation
  • Even if a test for the Covid-19 virus had been developed and approved in time, no institutions are in place to administer it to tens of millions of Americans free of charge.
  • Healthcare in America is delivered mainly by private for-profit corporations which, unlike financial institutions, are not required to maintain reserve capacity.
  • Its 45,000 intensive care unit beds fall woefully short of the 2.9 million likely to be needed.
  • Contrast this with America’s financial system. The Federal Reserve concerns itself with the health of financial markets as a whole. Late last week the Fed made $1.5tn available to banks, at the slightest hint of difficulties making trades. No one batted an eye.
  • more than 30 million Americans have no health insurance. Eligibility for Medicaid, food stamps and other public assistance is now linked to having or actively looking for work.
  • In Los Angeles, about 80% of students qualify for free or reduced lunches and just under 20,000 are homeless at some point during the school year.
  • There is no public health system in the US, in short, because the richest nation in the world has no capacity to protect the public as a whole, apart from national defense
Javier E

Opinion | When Public Health Loses the Public - The New York Times - 0 views

  • In “Within Reason: A Liberal Public Health for an Illiberal Time,” Sandro Galea, the dean of the Boston University School of Public Health, looks to his own field to explain the animating forces behind some of those disputes.
  • Despite remarkable successes, Galea argues, public health succumbed to a disturbing strain of illiberalism during the pandemic. This not only worsened the impact of the pandemic; it also destabilized public health institutions in ways that will serve us poorly when the next crisis comes.
  • If Americans have come to distrust public health advice, what role may public health officials have played in fostering that distrust?
  • ...6 more annotations...
  • American health experts advocated almost universal child vaccination; meanwhile, in Europe, experts cautioned against vaccinating young children, who were at low risk for serious illness, without more long-term data. “Were we pushing to vaccinate children for their sake or for ours?” Galea asks. “Were we doing it to support health or to make a political point?”
  • Scientists should have made more nuanced risk assessments and revisited them regularly. They should have taken into account the consequences and the disproportionate impact of strict lockdowns on lower-income workers and at-risk youth
  • This zero-sum mode of thinking — neglecting to take into account one’s own biases, succumbing to groupthink, operating according to the expectations of one’s “side,” discouraging good-faith debate — persisted even as the pandemic eased.
  • this tendency to view “core issues in Manichaean terms, with certain positions seen as on the side of good and others on the side of evil, with little gray area between,” as Galea puts it, has continued to inform public health postpandemic
  • It also undermines public faith in science, one of the few institutions that had maintained a high level of trust into the Trump era.
  • the percentage of Americans who believe science has a mostly positive effect on society dropped to 57 percent in 2023, from 67 percent in 2016. Those who say they have a great deal of confidence in scientists dropped to 23 percent, from 39 percent in 2020. And these declines took place among both Republicans and Democrats.
Javier E

Opinion | I Studied Five Countries' Health Care Systems. We Need to Get More Creative W... - 0 views

  • I’m convinced that the ability to get good, if not great, care in facilities that aren’t competing with one another is the main way that other countries obtain great outcomes for much less money. It also allows for more regulation and control to keep a lid on prices.
  • Because of government subsidies, most people spend less than 25 percent of their income on housing and can choose between buying new flats at highly subsidized prices or flats available for resale on an open market.
  • Other social determinants that matter include food security, access to education and even race. As part of New Zealand’s reforms, its Public Health Agency, which was established less than a year ago, specifically puts a “greater emphasis on equity and the wider determinants of health such as income, education and housing.” It also specifically seeks to address racism in health care, especially that which affects the Maori population.
  • ...9 more annotations...
  • When I asked about Australia’s rather impressive health outcomes, he said that while “Australia’s mortality that is amenable to, or influenced by, the health care system specifically is good, it’s not fundamentally better than that seen in peer O.E.C.D. countries, the U.S. excepted. Rather, Australia’s public health, social policy and living standards are more responsible for outcomes.”
  • Addressing these issues in the United States would require significant investment, to the tune of hundreds of billions or even trillions of dollars a year. That seems impossible until you remember that we spent more than $4.4 trillion on health care in 2022. We just don’t think of social policies like housing, food and education as health care.
  • Other countries, on the other hand, recognize that these issues are just as important, if not more so, than hospitals, drugs and doctors. Our narrow view too often defines health care as what you get when you’re sick, not what you might need to remain well.
  • When other countries choose to spend less on their health care systems (and it is a choice), they take the money they save and invest it in programs that benefit their citizens by improving social determinants of health
  • In the United States, conversely, we argue that the much less resourced programs we already have need to be cut further. The recent debt limit compromise reduces discretionary spending and makes it harder for people to access government programs like food stamps.
  • When I asked experts in each of these countries what might improve the areas where they are deficient (for instance, the N.H.S. has been struggling quite a bit as of late), they all replied the same way: more money. Some of them lack the political will to allocate those funds. Others can’t make major investments without drawing from other priorities.
  • While Singapore will need to spend more, it’s very unlikely to go above the 8 percent to 10 percent of G.D.P. that pretty much all developed countries have historically spent.
  • That is, all of them except the United States. We currently spend about 18 percent of G.D.P. on health care. That’s almost $12,000 per American. It’s about twice what other countries currently spend.
  • We cannot seem to do what other countries think is easy, while we’ve happily decided to do what other countries think is impossible. But this is also what gives me hope. We’ve already decided to spend the money; we just need to spend it better.
Javier E

He Could Have Seen What Was Coming: Behind Trump's Failure on the Virus - The New York ... - 0 views

  • “Any way you cut it, this is going to be bad,” a senior medical adviser at the Department of Veterans Affairs, Dr. Carter Mecher, wrote on the night of Jan. 28, in an email to a group of public health experts scattered around the government and universities. “The projected size of the outbreak already seems hard to believe.”
  • A week after the first coronavirus case had been identified in the United States, and six long weeks before President Trump finally took aggressive action to confront the danger the nation was facing — a pandemic that is now forecast to take tens of thousands of American lives — Dr. Mecher was urging the upper ranks of the nation’s public health bureaucracy to wake up and prepare for the possibility of far more drastic action.
  • Throughout January, as Mr. Trump repeatedly played down the seriousness of the virus and focused on other issues, an array of figures inside his government — from top White House advisers to experts deep in the cabinet departments and intelligence agencies — identified the threat, sounded alarms and made clear the need for aggressive action.
  • ...68 more annotations...
  • The president, though, was slow to absorb the scale of the risk and to act accordingly, focusing instead on controlling the message, protecting gains in the economy and batting away warnings from senior officials.
  • Mr. Trump’s response was colored by his suspicion of and disdain for what he viewed as the “Deep State” — the very people in his government whose expertise and long experience might have guided him more quickly toward steps that would slow the virus, and likely save lives.
  • The slow start of that plan, on top of the well-documented failures to develop the nation’s testing capacity, left administration officials with almost no insight into how rapidly the virus was spreading. “We were flying the plane with no instruments,” one official said.
  • But dozens of interviews with current and former officials and a review of emails and other records revealed many previously unreported details and a fuller picture of the roots and extent of his halting response as the deadly virus spread:
  • The National Security Council office responsible for tracking pandemics received intelligence reports in early January predicting the spread of the virus to the United States, and within weeks was raising options like keeping Americans home from work and shutting down cities the size of Chicago. Mr. Trump would avoid such steps until March.
  • Despite Mr. Trump’s denial weeks later, he was told at the time about a Jan. 29 memo produced by his trade adviser, Peter Navarro, laying out in striking detail the potential risks of a coronavirus pandemic: as many as half a million deaths and trillions of dollars in economic losses.
  • The health and human services secretary, Alex M. Azar II, directly warned Mr. Trump of the possibility of a pandemic during a call on Jan. 30, the second warning he delivered to the president about the virus in two weeks. The president, who was on Air Force One while traveling for appearances in the Midwest, responded that Mr. Azar was being alarmist
  • Mr. Azar publicly announced in February that the government was establishing a “surveillance” system
  • the task force had gathered for a tabletop exercise — a real-time version of a full-scale war gaming of a flu pandemic the administration had run the previous year. That earlier exercise, also conducted by Mr. Kadlec and called “Crimson Contagion,” predicted 110 million infections, 7.7 million hospitalizations and 586,000 deaths following a hypothetical outbreak that started in China.
  • By the third week in February, the administration’s top public health experts concluded they should recommend to Mr. Trump a new approach that would include warning the American people of the risks and urging steps like social distancing and staying home from work.
  • But the White House focused instead on messaging and crucial additional weeks went by before their views were reluctantly accepted by the president — time when the virus spread largely unimpeded.
  • When Mr. Trump finally agreed in mid-March to recommend social distancing across the country, effectively bringing much of the economy to a halt, he seemed shellshocked and deflated to some of his closest associates. One described him as “subdued” and “baffled” by how the crisis had played out. An economy that he had wagered his re-election on was suddenly in shambles.
  • He only regained his swagger, the associate said, from conducting his daily White House briefings, at which he often seeks to rewrite the history of the past several months. He declared at one point that he “felt it was a pandemic long before it was called a pandemic,” and insisted at another that he had to be a “cheerleader for the country,” as if that explained why he failed to prepare the public for what was coming.
  • Mr. Trump’s allies and some administration officials say the criticism has been unfair.
  • The Chinese government misled other governments, they say. And they insist that the president was either not getting proper information, or the people around him weren’t conveying the urgency of the threat. In some cases, they argue, the specific officials he was hearing from had been discredited in his eyes, but once the right information got to him through other channels, he made the right calls.
  • “While the media and Democrats refused to seriously acknowledge this virus in January and February, President Trump took bold action to protect Americans and unleash the full power of the federal government to curb the spread of the virus, expand testing capacities and expedite vaccine development even when we had no true idea the level of transmission or asymptomatic spread,” said Judd Deere, a White House spokesman.
  • Decision-making was also complicated by a long-running dispute inside the administration over how to deal with China
  • The Containment Illusion: By the last week of February, it was clear to the administration’s public health team that schools and businesses in hot spots would have to close. But in the turbulence of the Trump White House, it took three more weeks to persuade the president that failure to act quickly to control the spread of the virus would have dire consequences.
  • There were key turning points along the way, opportunities for Mr. Trump to get ahead of the virus rather than just chase it. There were internal debates that presented him with stark choices, and moments when he could have chosen to ask deeper questions and learn more. How he handled them may shape his re-election campaign. They will certainly shape his legacy.
  • Facing the likelihood of a real pandemic, the group needed to decide when to abandon “containment” — the effort to keep the virus outside the U.S. and to isolate anyone who gets infected — and embrace “mitigation” to thwart the spread of the virus inside the country until a vaccine becomes available.
  • Among the questions on the agenda, which was reviewed by The New York Times, was when the department’s secretary, Mr. Azar, should recommend that Mr. Trump take textbook mitigation measures “such as school dismissals and cancellations of mass gatherings,” which had been identified as the next appropriate step in a Bush-era pandemic plan.
  • The group — including Dr. Anthony S. Fauci of the National Institutes of Health; Dr. Robert R. Redfield of the Centers for Disease Control and Prevention, and Mr. Azar, who at that stage was leading the White House Task Force — concluded they would soon need to move toward aggressive social distancing
  • A 20-year-old Chinese woman had infected five relatives with the virus even though she never displayed any symptoms herself. The implication was grave — apparently healthy people could be unknowingly spreading the virus — and supported the need to move quickly to mitigation.
  • The following day, Dr. Kadlec and the others decided to present Mr. Trump with a plan titled “Four Steps to Mitigation,” telling the president that they needed to begin preparing Americans for a step rarely taken in United States history.
  • a presidential blowup and internal turf fights would sidetrack such a move. The focus would shift to messaging and confident predictions of success rather than publicly calling for a shift to mitigation.
  • These final days of February, perhaps more than any other moment during his tenure in the White House, illustrated Mr. Trump’s inability or unwillingness to absorb warnings coming at him.
  • He instead reverted to his traditional political playbook in the midst of a public health calamity, squandering vital time as the coronavirus spread silently across the country.
  • A memo dated Feb. 14, prepared in coordination with the National Security Council and titled “U.S. Government Response to the 2019 Novel Coronavirus,” documented what more drastic measures would look like, including: “significantly limiting public gatherings and cancellation of almost all sporting events, performances, and public and private meetings that cannot be convened by phone. Consider school closures. Widespread ‘stay at home’ directives from public and private organizations with nearly 100% telework for some.”
  • his friend had a blunt message: You need to be ready. The virus, he warned, which originated in the city of Wuhan, was being transmitted by people who were showing no symptoms — an insight that American health officials had not yet accepted.
  • Mr. Trump was walking up the steps of Air Force One to head home from India on Feb. 25 when Dr. Nancy Messonnier, the director of the National Center for Immunization and Respiratory Diseases, publicly issued the blunt warning they had all agreed was necessary.
  • On the 18-hour plane ride home, Mr. Trump fumed as he watched the stock market crash after Dr. Messonnier’s comments. Furious, he called Mr. Azar when he landed at around 6 a.m. on Feb. 26, raging that Dr. Messonnier had scared people unnecessarily.
  • The meeting that evening with Mr. Trump to advocate social distancing was canceled, replaced by a news conference in which the president announced that the White House response would be put under the command of Vice President Mike Pence.
  • The push to convince Mr. Trump of the need for more assertive action stalled. With Mr. Pence and his staff in charge, the focus was clear: no more alarmist messages. Statements and media appearances by health officials like Dr. Fauci and Dr. Redfield would be coordinated through Mr. Pence’s office
  • It would be more than three weeks before Mr. Trump would announce serious social distancing efforts, a lost period during which the spread of the virus accelerated rapidly. Over nearly three weeks from Feb. 26 to March 16, the number of confirmed coronavirus cases in the United States grew from 15 to 4,226
  • The China Factor: The earliest warnings about coronavirus got caught in the crosscurrents of the administration’s internal disputes over China. It was the China hawks who pushed earliest for a travel ban. But their animosity toward China also undercut hopes for a more cooperative approach by the world’s two leading powers to a global crisis.
  • It was early January, and the call with a Hong Kong epidemiologist left Matthew Pottinger rattled.
  • It was one of the earliest warnings to the White House, and it echoed the intelligence reports making their way to the National Security Council
  • some of the more specialized corners of the intelligence world were producing sophisticated and chilling warnings.
  • In a report to the director of national intelligence, the State Department’s epidemiologist wrote in early January that the virus was likely to spread across the globe, and warned that the coronavirus could develop into a pandemic
  • Working independently, a small outpost of the Defense Intelligence Agency, the National Center for Medical Intelligence, came to the same conclusion.
  • By mid-January there was growing evidence of the virus spreading outside China. Mr. Pottinger began convening daily meetings about the coronavirus
  • The early alarms sounded by Mr. Pottinger and other China hawks were freighted with ideology — including a push to publicly blame China that critics in the administration say was a distraction
  • And they ran into opposition from Mr. Trump’s economic advisers, who worried a tough approach toward China could scuttle a trade deal that was a pillar of Mr. Trump’s re-election campaign.
  • Mr. Pottinger continued to believe the coronavirus problem was far worse than the Chinese were acknowledging. Inside the West Wing, the director of the Domestic Policy Council, Joe Grogan, also tried to sound alarms that the threat from China was growing.
  • The Consequences of Chaos: The chaotic culture of the Trump White House contributed to the crisis. A lack of planning and a failure to execute, combined with the president’s focus on the news cycle and his preference for following his gut rather than the data, cost time, and perhaps lives.
  • the hawks kept pushing in February to take a critical stance toward China amid the growing crisis. Mr. Pottinger and others — including aides to Secretary of State Mike Pompeo — pressed for government statements to use the term “Wuhan Virus.” Mr. Pompeo tried to hammer the anti-China message at every turn, eventually even urging leaders of the Group of 7 industrialized countries to use “Wuhan virus” in a joint statement.
  • Others, including aides to Mr. Pence, resisted taking a hard public line, believing that angering Beijing might lead the Chinese government to withhold medical supplies, pharmaceuticals and any scientific research that might ultimately lead to a vaccine.
  • Mr. Trump took a conciliatory approach through the middle of March, praising the job Mr. Xi was doing.
  • That changed abruptly, when aides informed Mr. Trump that a Chinese Foreign Ministry spokesman had publicly spun a new conspiracy about the origins of Covid-19: that it was brought to China by U.S. Army personnel who visited the country last October.
  • On March 16, he wrote on Twitter that “the United States will be powerfully supporting those industries, like Airlines and others, that are particularly affected by the Chinese Virus.”
  • Mr. Trump’s decision to escalate the war of words undercut any remaining possibility of broad cooperation between the governments to address a global threat
  • Mr. Pottinger, backed by Mr. O’Brien, became one of the driving forces of a campaign in the final weeks of January to convince Mr. Trump to impose limits on travel from China
  • he circulated a memo on Jan. 29 urging Mr. Trump to impose the travel limits, arguing that failing to confront the outbreak aggressively could be catastrophic, leading to hundreds of thousands of deaths and trillions of dollars in economic losses.
  • The uninvited message could not have conflicted more with the president’s approach at the time of playing down the severity of the threat. And when aides raised it with Mr. Trump, he responded that he was unhappy that Mr. Navarro had put his warning in writing.
  • From the time the virus was first identified as a concern, the administration’s response was plagued by the rivalries and factionalism that routinely swirl around Mr. Trump and, along with the president’s impulsiveness, undercut decision making and policy development.
  • Even after Mr. Azar first briefed him about the potential seriousness of the virus during a phone call on Jan. 18 while the president was at his Mar-a-Lago resort in Florida, Mr. Trump projected confidence that it would be a passing problem.
  • “We have it totally under control,” he told an interviewer a few days later while attending the World Economic Forum in Switzerland. “It’s going to be just fine.”
  • The efforts to sort out policy behind closed doors were contentious and sometimes only loosely organized.
  • That was the case when the National Security Council convened a meeting on short notice on the afternoon of Jan. 27. The Situation Room was standing room only, packed with top White House advisers, low-level staffers, Mr. Trump’s social media guru, and several cabinet secretaries. There was no checklist about the preparations for a possible pandemic,
  • Instead, after a 20-minute description by Mr. Azar of his department’s capabilities, the meeting was jolted when Stephen E. Biegun, the newly installed deputy secretary of state, announced plans to issue a “level four” travel warning, strongly discouraging Americans from traveling to China. The room erupted into bickering.
  • A few days later, on the evening of Jan. 30, Mick Mulvaney, the acting White House chief of staff at the time, and Mr. Azar called Air Force One as the president was making the final decision to go ahead with the restrictions on China travel. Mr. Azar was blunt, warning that the virus could develop into a pandemic and arguing that China should be criticized for failing to be transparent.
  • Stop panicking, Mr. Trump told him.That sentiment was present throughout February, as the president’s top aides reached for a consistent message but took few concrete steps to prepare for the possibility of a major public health crisis.
  • As February gave way to March, the president continued to be surrounded by divided factions even as it became clearer that avoiding more aggressive steps was not tenable.
  • the virus was already multiplying across the country — and hospitals were at risk of buckling under the looming wave of severely ill people, lacking masks and other protective equipment, ventilators and sufficient intensive care beds. The question loomed over the president and his aides after weeks of stalling and inaction: What were they going to do?
  • Even then, and even by Trump White House standards, the debate over whether to shut down much of the country to slow the spread was especially fierce.
  • In a tense Oval Office meeting, when Mr. Mnuchin again stressed that the economy would be ravaged, Mr. O’Brien, the national security adviser, who had been worried about the virus for weeks, sounded exasperated as he told Mr. Mnuchin that the economy would be destroyed regardless if officials did nothing.
  • in the end, aides said, it was Dr. Deborah L. Birx, the veteran AIDS researcher who had joined the task force, who helped to persuade Mr. Trump. Soft-spoken and fond of the kind of charts and graphs Mr. Trump prefers, Dr. Birx did not have the rough edges that could irritate the president. He often told people he thought she was elegant.
  • During the last week in March, Kellyanne Conway, a senior White House adviser involved in task force meetings, gave voice to concerns other aides had. She warned Mr. Trump that his wished-for date of Easter to reopen the country likely couldn’t be accomplished. Among other things, she told him, he would end up being blamed by critics for every subsequent death caused by the virus.