These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • ...297 more annotations...
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, first emerged only in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations on distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free, and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
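    A quick check of Franklin's arithmetic (a sketch only, using just the one-million baseline and the twenty-five-year doubling period given above):

    \[ P(t) = P_0 \cdot 2^{t/25}, \qquad P(100) = 1{,}000{,}000 \times 2^{100/25} = 16{,}000{,}000. \]

    Four doublings in a century yield a sixteenfold increase, to sixteen million, comfortably more than Britain's mid-eighteenth-century population of roughly six to seven million, which is why Franklin could predict that within a century "the greatest Number of Englishmen will be on this Side the Water."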
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it ensures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Britain, Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • Even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • The war ended with the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • The territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • The Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of Chief Justice Taney’s ruling stunned readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • Portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for the first Lincoln-Douglas debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised eleven states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (“Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____”), as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” Mary Lease said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • Instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations.
  • But in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • Henry George devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • State legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where it appealed to lawmakers eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives.
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • They argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.”
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, George Creel, the head of Wilson’s wartime propaganda office, called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 dead, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

The Age of Social Media Is Ending - The Atlantic - 0 views

  • Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
  • A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is entirely and completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
  • “social media,” a name so familiar that it has ceased to bear meaning. But two decades ago, that term didn’t exist
  • ...35 more annotations...
  • Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
  • As the original name suggested, social networking involved connecting, not publishing. By connecting your personal network of trusted contacts (or “strong ties,” as sociologists call them) to others’ such networks (via “weak ties”), you could surface a larger network of trusted contacts
  • The whole idea of social networks was networking: building or deepening relationships, mostly with people you knew. How and why that deepening happened was largely left to the users to decide.
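To make the strong-tie/weak-tie mechanics concrete, here is a minimal sketch of friend-of-friend surfacing, the operation the excerpt describes. It is my illustration, not anything from the article: the names and the `friends` adjacency map are hypothetical.

```python
# A toy social graph: each user's set of "strong ties."
friends = {
    "ana": {"ben", "cho"},
    "ben": {"ana", "dev"},
    "cho": {"ana", "eve"},
    "dev": {"ben"},
    "eve": {"cho"},
}

def weak_ties(user: str) -> set[str]:
    """Surface friends-of-friends: people reachable in two hops but not one."""
    direct = friends[user]
    two_hops = set()
    for friend in direct:
        two_hops |= friends[friend]
    # Exclude people the user already knows, and the user themself.
    return two_hops - direct - {user}

print(weak_ties("ana"))  # {'dev', 'eve'} -- the larger surfaced network
```

The point of the excerpt is that early services stopped roughly here: the surfaced network stayed latent until the user chose to do something with it.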
  • That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts.
  • A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
  • soon enough, all social networks became social media first and foremost. When groups, pages, and the News Feed launched, Facebook began encouraging users to share content published by others in order to increase engagement on the service, rather than to provide updates to friends. LinkedIn launched a program to publish content across the platform, too. Twitter, already principally a publishing platform, added a dedicated “retweet” feature, making it far easier to spread content virally across user networks.
  • The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
  • The toxicity of social media makes it easy to forget how truly magical this innovation felt when it was new. From 2004 to 2009, you could join Facebook and everyone you’d ever known—including people you’d definitely lost track of—was right there, ready to connect or reconnect. The posts and photos I saw characterized my friends’ changing lives, not the conspiracy theories that their unhinged friends had shared with them
  • Twitter, which launched in 2006, was probably the first true social-media site, even if nobody called it that at the time. Instead of focusing on connecting people, the site amounted to a giant, asynchronous chat room for the world. Twitter was for talking to everyone—which is perhaps one of the reasons journalists have flocked to it
  • on Twitter, anything anybody posted could be seen instantly by anyone else. And furthermore, unlike posts on blogs or images on Flickr or videos on YouTube, tweets were short and low-effort, making it easy to post many of them a week or even a day.
  • This is also why journalists became so dependent on Twitter: It’s a constant stream of sources, events, and reactions—a reporting automat, not to mention an outbound vector for media tastemakers to make tastes.
  • When we look back at this moment, social media had already arrived in spirit if not by name. RSS readers offered a feed of blog posts to catch up on, complete with unread counts. MySpace fused music and chatter; YouTube did it with video (“Broadcast Yourself”)
  • a “web 2.0” revolution in “user-generated content,” offering easy-to-use, easily adopted tools on websites and then mobile apps. They were built for creating and sharing “content,”
  • Other services arrived or evolved in this vein, among them Reddit, Snapchat, and WhatsApp, all far more popular than Twitter. Social networks, once latent routes for possible contact, became superhighways of constant content
  • Although you can connect the app to your contacts and follow specific users, on TikTok, you are more likely to simply plug into a continuous flow of video content that has oozed to the surface via algorithm.
  • In the social-networking era, the connections were essential, driving both content creation and consumption. But the social-media era seeks the thinnest, most soluble connections possible, just enough to allow the content to flow.
  • Facebook and all the rest enjoyed a massive rise in engagement and the associated data-driven advertising profits that the attention-driven content economy created. The same phenomenon also created the influencer economy, in which individual social-media users became valuable as channels for distributing marketing messages or product sponsorships by means of their posts’ real or imagined reach
  • “influencer” became an aspirational role, especially for young people for whom Instagram fame seemed more achievable than traditional celebrity—or perhaps employment of any kind.
  • social-media operators discovered that the more emotionally charged the content, the better it spread across its users’ networks. Polarizing, offensive, or just plain fraudulent information was optimized for distribution. By the time the platforms realized and the public revolted, it was too late to turn off these feedback loops.
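The feedback loop the excerpt describes can be caricatured in a few lines. This is a toy model of my own construction, not the platforms’ actual machinery; the posts and their “emotional charge” scores are invented.

```python
import random

# Two hypothetical posts with invented "emotional charge" scores:
# the probability a viewer engages when the post is shown.
charge = {"measured take": 0.1, "outrage bait": 0.9}
engagement = {post: 1.0 for post in charge}

for _ in range(1000):
    # The platform shows posts in proportion to accumulated engagement...
    shown = random.choices(list(charge), weights=list(engagement.values()))[0]
    # ...and more emotionally charged posts get engaged with more often,
    # which raises their future exposure: the feedback loop.
    if random.random() < charge[shown]:
        engagement[shown] += 1

print(engagement)  # the charged post ends up dominating the distribution
```

Nothing in the loop requires the charged content to be true or valuable; it only has to provoke a response, which is the optimization the excerpt points to.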
  • When network connections become activated for any reason or no reason, then every connection seems worthy of traversing.
  • Rounding up friends or business contacts into a pen in your online profile for possible future use was never a healthy way to understand social relationships.
  • when social networking evolved into social media, user expectations escalated. Driven by venture capitalists’ expectations and then Wall Street’s demands, the tech companies—Google and Facebook and all the rest—became addicted to massive scale
  • Social media showed that everyone has the potential to reach a massive audience at low cost and high gain—and that potential gave many people the impression that they deserve such an audience.
  • On social media, everyone believes that anyone to whom they have access owes them an audience: a writer who posted a take, a celebrity who announced a project, a pretty girl just trying to live her life, that anon who said something afflictive
  • The ensuing disaster was multipart.
  • people just aren’t meant to talk to one another this much. They shouldn’t have that much to say, they shouldn’t expect to receive such a large audience for that expression, and they shouldn’t suppose a right to comment or rejoinder for every thought or notion either.
  • From being asked to review every product you buy to believing that every tweet or Instagram image warrants likes or comments or follows, social media produced a positively unhinged, sociopathic rendition of human sociality.
  • That’s no surprise, I guess, given that the model was forged in the fires of Big Tech companies such as Facebook, where sociopathy is a design philosophy.
  • If change is possible, carrying it out will be difficult, because we have adapted our lives to conform to social media’s pleasures and torments. It’s seemingly as hard to give up on social media as it was to give up smoking en masse
  • Quitting that habit took decades of regulatory intervention, public-relations campaigning, social shaming, and aesthetic shifts. At a cultural level, we didn’t stop smoking just because the habit was unpleasant or uncool or even because it might kill us. We did so slowly and over time, by forcing social life to suffocate the practice. That process must now begin in earnest for social media.
  • Something may yet survive the fire that would burn it down: social networks, the services’ overlooked, molten core. It was never a terrible idea, at least, to use computers to connect to one another on occasion, for justified reasons, and in moderation
  • The problem came from doing so all the time, as a lifestyle, an aspiration, an obsession. The offer was always too good to be true, but it’s taken us two decades to realize the Faustian nature of the bargain.
  • when I first wrote about downscale, the ambition seemed necessary but impossible. It still feels unlikely—but perhaps newly plausible.
  • To win the soul of social life, we must learn to muzzle it again, across the globe, among billions of people. To speak less, to fewer people and less often–and for them to do the same to you, and everyone else as well
  • We cannot make social media good, because it is fundamentally bad, deep in its very structure. All we can do is hope that it withers away, and play our small part in helping abandon it.
Javier E

Collapsing Levels of Trust Are Devastating America - The Atlantic - 0 views

  • American history is driven by periodic moments of moral convulsion
  • Harvard political scientist Samuel P. Huntington noticed that these convulsions seem to hit the United States every 60 years or so: the Revolutionary period of the 1760s and ’70s; the Jacksonian uprising of the 1820s and ’30s; the Progressive Era, which began in the 1890s; and the social-protest movements of the 1960s and early ’70s
  • A highly moralistic generation appears on the scene. It uses new modes of communication to seize control of the national conversation. Groups formerly outside of power rise up and take over the system. These are moments of agitation and excitement, frenzy and accusation, mobilization and passion.
  • ...168 more annotations...
  • In 1981, Huntington predicted that the next moral convulsion would hit America around the second or third decade of the 21st century—that is, right about now.
  • Trump is the final instrument of this crisis, but the conditions that brought him to power and make him so dangerous at this moment were decades in the making, and those conditions will not disappear if he is defeated.
  • Social trust is a measure of the moral quality of a society—of whether the people and institutions in it are trustworthy, whether they keep their promises and work for the common good.
  • When people in a society lose faith or trust in their institutions and in each other, the nation collapses.
  • This is an account of how, over the past few decades, America became a more untrustworthy society
  • under the stresses of 2020, American institutions and the American social order crumbled and were revealed as more untrustworthy still
  • We had a chance, in crisis, to pull together as a nation and build trust. We did not. That has left us a broken, alienated society caught in a distrust doom loop.
  • The Baby Boomers grew up in the 1950s and ’60s, an era of family stability, widespread prosperity, and cultural cohesion. The mindset they embraced in the late ’60s and have embodied ever since was all about rebelling against authority, unshackling from institutions, and celebrating freedom, individualism, and liberation.
  • The emerging generations today enjoy none of that sense of security. They grew up in a world in which institutions failed, financial systems collapsed, and families were fragile. Children can now expect to have a lower quality of life than their parents, the pandemic rages, climate change looms, and social media is vicious. Their worldview is predicated on threat, not safety.
  • Thus the values of the Millennial and Gen Z generations that will dominate in the years ahead are the opposite of Boomer values: not liberation, but security; not freedom, but equality; not individualism, but the safety of the collective; not sink-or-swim meritocracy, but promotion on the basis of social justice
  • A new culture is dawning. The Age of Precarity is here.
  • I’ve spent my career rebutting the idea that America is in decline, but the events of these past six years, and especially of 2020, have made clear that we live in a broken nation. The cancer of distrust has spread to every vital organ.
  • Those were the days of triumphant globalization. Communism was falling. Apartheid was ending. The Arab-Israeli dispute was calming down. Europe was unifying. China was prospering. In the United States, a moderate Republican president, George H. W. Bush, gave way to the first Baby Boomer president, a moderate Democrat, Bill Clinton.
  • The stench of national decline is in the air. A political, social, and moral order is dissolving. America will only remain whole if we can build a new order in its place.
  • The American economy grew nicely. The racial wealth gap narrowed. All the great systems of society seemed to be working: capitalism, democracy, pluralism, diversity, globalization. It seemed, as Francis Fukuyama wrote in his famous “The End of History?” essay for The National Interest, “an unabashed victory for economic and political liberalism.”
  • Nations with low social trust—like Brazil, Morocco, and Zimbabwe—have struggling economies.
  • We think of the 1960s as the classic Boomer decade, but the false summer of the 1990s was the high-water mark of that ethos
  • The first great theme of that era was convergence. Walls were coming down. Everybody was coming together.
  • The second theme was the triumph of classical liberalism. Liberalism was not just a philosophy—it was a spirit and a zeitgeist, a faith that individual freedom would blossom in a loosely networked democratic capitalist world. Enterprise and creativity would be unleashed. America was the great embodiment and champion of this liberation.
  • The third theme was individualism. Society flourished when individuals were liberated from the shackles of society and the state, when they had the freedom to be true to themselves.
  • For his 2001 book, Moral Freedom, the political scientist Alan Wolfe interviewed a wide array of Americans. The moral culture he described was no longer based on mainline Protestantism, as it had been for generations
  • Instead, Americans, from urban bobos to suburban evangelicals, were living in a state of what he called moral freedom: the belief that life is best when each individual finds his or her own morality—inevitable in a society that insists on individual freedom.
  • moral freedom, like the other dominant values of the time, contained within it a core assumption: If everybody does their own thing, then everything will work out for everybody.
  • This was an ideology of maximum freedom and minimum sacrifice.
  • It all looks naive now. We were naive about what the globalized economy would do to the working class, naive to think the internet would bring us together, naive to think the global mixing of people would breed harmony, naive to think the privileged wouldn’t pull up the ladders of opportunity behind them
  • Over the 20 years after I sat with Kosieva, it all began to unravel. The global financial crisis had hit, the Middle East was being ripped apart by fanatics. On May 15, 2011, street revolts broke out in Spain, led by the self-declared Indignados—“the outraged.” “They don’t represent us!” they railed as an insult to the Spanish establishment. It would turn out to be the cry of a decade.
  • Millennials and members of Gen Z have grown up in the age of that disappointment, knowing nothing else. In the U.S. and elsewhere, this has produced a crisis of faith, across society but especially among the young. It has produced a crisis of trust.
  • Social trust is a generalized faith in the people of your community. It consists of smaller faiths. It begins with the assumption that we are interdependent, our destinies linked. It continues with the assumption that we share the same moral values. We share a sense of what is the right thing to do in different situations
  • High-trust societies have what Fukuyama calls spontaneous sociability. People are able to organize more quickly, initiate action, and sacrifice for the common good.
  • When you look at research on social trust, you find all sorts of virtuous feedback loops. Trust produces good outcomes, which then produce more trust. In high-trust societies, corruption is lower and entrepreneurship is catalyzed.
  • Higher-trust nations have lower economic inequality, because people feel connected to each other and are willing to support a more generous welfare state.
  • People in high-trust societies are more civically engaged. Nations that score high in social trust—like the Netherlands, Sweden, China, and Australia—have rapidly growing or developed economies.
  • Renewal is hard to imagine. Destruction is everywhere, and construction difficult to see.
  • As the ethicist Sissela Bok once put it, “Whatever matters to human beings, trust is the atmosphere in which it thrives.”
  • During most of the 20th century, through depression and wars, Americans expressed high faith in their institutions
  • In 1964, for example, 77 percent of Americans said they trusted the federal government to do the right thing most or all of the time.
  • By 1994, only one in five Americans said they trusted government to do the right thing.
  • Then came the Iraq War and the financial crisis and the election of Donald Trump. Institutional trust levels remained pathetically low. What changed was the rise of a large group of people who were actively and poisonously alienated—who were not only distrustful but explosively distrustful. Explosive distrust is not just an absence of trust or a sense of detached alienation—it is an aggressive animosity and an urge to destroy. Explosive distrust is the belief that those who disagree with you are not just wrong but illegitimate.
  • In 1997, 64 percent of Americans had a great or good deal of trust in the political competence of their fellow citizens; today only a third of Americans feel that way.
  • In most societies, interpersonal trust is stable over the decades. But for some—like Denmark, where about 75 percent say the people around them are trustworthy, and the Netherlands, where two-thirds say so—the numbers have actually risen.
  • In America, interpersonal trust is in catastrophic decline. In 2014, according to the General Social Survey conducted by NORC at the University of Chicago, only 30.3 percent of Americans agreed that “most people can be trusted,”
  • Today, a majority of Americans say they don’t trust other people when they first meet them.
  • There’s evidence to suggest that marital infidelity, academic cheating, and animal cruelty are all on the rise in America, but it’s hard to directly measure the overall moral condition of society—how honest people are, and how faithful.
  • Trust is the ratio between the number of people who betray you and the number of people who remain faithful to you. It’s not clear that there is more betrayal in America than there used to be—but there are certainly fewer faithful supports around people than there used to be.
  • Hundreds of books and studies on declining social capital and collapsing family structure demonstrate this. In the age of disappointment, people are less likely to be surrounded by faithful networks of people they can trust.
  • Black Americans have high trust in other Black Americans; it’s the wider society they don’t trust, for good and obvious reasons
  • As Vallier puts it, trust levels are a reflection of the moral condition of a nation at any given time.
  • high national trust is a collective moral achievement.
  • High national distrust is a sign that people have earned the right to be suspicious. Trust isn’t a virtue—it’s a measure of other people’s virtue.
  • Unsurprisingly, the groups with the lowest social trust in America are among the most marginalized.
  • Black Americans have been one of the most ill-treated groups in American history; their distrust is earned distrust
  • In 2018, 37.3 percent of white Americans felt that most people can be trusted, according to the General Social Survey, but only 15.3 percent of Black Americans felt the same.
  • People become trusting when the world around them is trustworthy. When they are surrounded by people who live up to their commitments. When they experience their country as a fair place.
  • In 2002, 43 percent of Black Americans were very or somewhat satisfied with the way Black people are treated in the U.S. By 2018, only 18 percent felt that way, according to Gallup.
  • The second disenfranchised low-trust group includes the lower-middle class and the working poor.
  • this group makes up about 40 percent of the country.
  • “They are driven by the insecurity of their place in society and in the economy,” he says. They are distrustful of technology and are much more likely to buy into conspiracy theories. “They’re often convinced by stories that someone is trying to trick them, that the world is against them,”
  • the third marginalized group that scores extremely high on social distrust: young adults. These are people who grew up in the age of disappointment. It’s the only world they know.
  • In 2012, 40 percent of Baby Boomers believed that most people can be trusted, as did 31 percent of members of Generation X. In contrast, only 19 percent of Millennials said most people can be trusted
  • Seventy-three percent of adults under 30 believe that “most of the time, people just look out for themselves,” according to a Pew survey from 2018. Seventy-one percent of those young adults say that most people “would try to take advantage of you if they got a chance.”
  • A mere 10 percent of Gen Zers trust politicians to do the right thing.
  • Only 35 percent of young people, versus 67 percent of old people, believe that Americans respect the rights of people who are not like them.
  • Fewer than a third of Millennials say America is the greatest country in the world, compared to 64 percent of members of the Silent Generation.
  • “values and behavior are shaped by the degree to which survival is secure.” In the age of disappointment, our sense of safety went away
  • Some of this is physical insecurity: school shootings, terrorist attacks, police brutality, and overprotective parenting at home
  • the true insecurity is financial, social, and emotional.
  • First, financial insecurity.
  • By the time the Baby Boomers hit a median age of 35, their generation owned 21 percent of the nation’s wealth.
  • As of last year, Millennials—who will hit an average age of 35 in three years—owned just 3.2 percent of the nation’s wealth.
  • Next, emotional insecurity:
  • fewer children growing up in married two-parent households, more single-parent households, more depression, and higher suicide rates.
  • Then, identity insecurity.
  • All the traits that were once assigned to you by your community, you must now determine on your own: your identity, your morality, your gender, your vocation, your purpose, and the place of your belonging. Self-creation becomes a major anxiety-inducing act of young adulthood.
  • liquid modernity
  • Finally, social insecurity.
  • In the age of social media our “sociometers”—the antennae we use to measure how other people are seeing us—are up and on high alert all the time. Am I liked? Am I affirmed?
  • Danger is ever present. “For many people, it is impossible to think without simultaneously thinking about what other people would think about what you’re thinking,” the educator Fredrik deBoer has written. “This is exhausting and deeply unsatisfying. As long as your self-conception is tied up in your perception of other people’s conception of you, you will never be free to occupy a personality with confidence; you’re always at the mercy of the next person’s dim opinion of you and your whole deal.”
  • In this world, nothing seems safe; everything feels like chaos.
  • Distrust sows distrust. It produces the spiritual state that Emile Durkheim called anomie, a feeling of being disconnected from society, a feeling that the whole game is illegitimate, that you are invisible and not valued, a feeling that the only person you can really trust is yourself.
  • People plagued by distrust can start to see threats that aren’t there; they become risk averse
  • Americans take fewer risks and are much less entrepreneurial than they used to be. In 2014, the rate of business start-ups hit a nearly 40-year low. Since the early 1970s, the rate at which people move across state lines each year has dropped by 56 percent
  • People lose faith in experts. They lose faith in truth, in the flow of information that is the basis of modern society. “A world of truth is a world of trust, and vice versa,”
  • In periods of distrust, you get surges of populism; populism is the ideology of those who feel betrayed
  • People are drawn to leaders who use the language of menace and threat, who tell group-versus-group power narratives. You also get a lot more political extremism. People seek closed, rigid ideological systems that give them a sense of security.
  • fanaticism is a response to existential anxiety. When people feel naked and alone, they revert to tribe. Their radius of trust shrinks, and they only trust their own kind.
  • When many Americans see Trump’s distrust, they see a man who looks at the world as they do.
  • By February 2020, America was a land mired in distrust. Then the plague arrived.
  • From the start, the pandemic has hit the American mind with sledgehammer force. Anxiety and depression have spiked. In April, Gallup recorded a record drop in self-reported well-being, as the share of Americans who said they were thriving fell to the same low point as during the Great Recession
  • These kinds of drops tend to produce social upheavals. A similar drop was seen in Tunisian well-being just before the street protests that led to the Arab Spring.
  • The emotional crisis seems to have hit low-trust groups the hardest
  • “low trusters” were more nervous during the early months of the pandemic, more likely to have trouble sleeping, more likely to feel depressed, less likely to say the public authorities were responding well to the pandemic
  • Eighty-one percent of Americans under 30 reported feeling anxious, depressed, lonely, or hopeless at least one day in the previous week, compared to 48 percent of adults 60 and over.
  • Americans looked to their governing institutions to keep them safe. And nearly every one of their institutions betrayed them
  • The president downplayed the crisis, and his administration was a daily disaster area
  • The Centers for Disease Control and Prevention produced faulty tests, failed to provide up-to-date data on infections and deaths, and didn’t provide a trustworthy voice for a scared public.
  • The Food and Drug Administration wouldn’t allow private labs to produce their own tests without a lengthy approval process.
  • In nations that ranked high on the World Values Survey measure of interpersonal trust—like China, Australia, and most of the Nordic states—leaders were able to mobilize quickly, come up with a plan, and count on citizens to comply with the new rules.
  • In low-trust nations—like Mexico, Spain, and Brazil—there was less planning, less compliance, less collective action, and more death.
  • Countries that fell somewhere in the middle—including the U.S., Germany, and Japan—had a mixed record depending on the quality of their leadership.
  • South Korea, where more than 65 percent of people say they trust government when it comes to health care, was able to build a successful test-and-trace regime. In America, where only 31 percent of Republicans and 44 percent of Democrats say the government should be able to use cellphone data to track compliance with experts’ coronavirus social-contact guidelines, such a system was never really implemented.
  • For decades, researchers have been warning about institutional decay. Institutions get caught up in one of those negative feedback loops that are so common in a world of mistrust. They become ineffective and lose legitimacy. People who lose faith in them tend not to fund them. Talented people don’t go to work for them. They become more ineffective still.
  • On the right, this anti-institutional bias has manifested itself as hatred of government; an unwillingness to defer to expertise, authority, and basic science; and a reluctance to fund the civic infrastructure of society, such as a decent public health system
  • On the left, distrust of institutional authority has manifested as a series of checks on power that have given many small actors the power to stop common plans, producing what Fukuyama calls a vetocracy
  • In 2020, American institutions groaned and sputtered. Academics wrote up plan after plan and lobbed them onto the internet. Few of them went anywhere. America had lost the ability to build new civic structures to respond to ongoing crises like climate change, opioid addiction, and pandemics, or to reform existing ones.
  • In a lower-trust era like today, Levin told me, “there is a greater instinct to say, ‘They’re failing us.’ We see ourselves as outsiders to the systems—an outsider mentality that’s hard to get out of.”
  • Americans haven’t just lost faith in institutions; they’ve come to loathe them, even to think that they are evil
  • 55 percent of Americans believe that the coronavirus that causes COVID-19 was created in a lab and 59 percent believe that the U.S. government is concealing the true number of deaths
  • Half of all Fox News viewers believe that Bill Gates is plotting a mass-vaccination campaign so he can track people.
  • This spring, nearly a third of Americans were convinced that it was probably or definitely true that a vaccine existed but was being withheld by the government.
  • institutions like the law, the government, the police, and even the family don’t merely serve social functions, Levin said; they form the individuals who work and live within them. The institutions provide rules to live by, standards of excellence to live up to, social roles to fulfill.
  • By 2020, people had stopped seeing institutions as places they entered to be morally formed,
  • Instead, they see institutions as stages on which they can perform, can display their splendid selves.
  • People run for Congress not so they can legislate, but so they can get on TV. People work in companies so they can build their personal brand.
  • The result is a world in which institutions not only fail to serve their social function and keep us safe, they also fail to form trustworthy people. The rot in our structures spreads to a rot in ourselves.
  • The Failure of Society
  • The coronavirus has confronted America with a social dilemma. A social dilemma, the University of Pennsylvania scholar Cristina Bicchieri notes, is “a situation in which each group member gets a higher outcome if she pursues her individual self-interest, but everyone in the group is better off if all group members further the common interest.”
  • Social distancing is a social dilemma. Many low-risk individuals have been asked to endure some large pain (unemployment, bankruptcy) and some small inconvenience (mask wearing) for the sake of the common good. If they could make and keep this moral commitment to each other in the short term, the curve would be crushed, and in the long run we’d all be better off. It is the ultimate test of American trustworthiness.
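Bicchieri’s definition is the standard game-theoretic one, and the social-distancing case has exactly the payoff structure of a prisoner’s dilemma. A minimal worked example, with payoff numbers invented purely for illustration:

```python
# Hypothetical payoffs for one person, given (my choice, other's choice).
# "cooperate" = distance and mask; "defect" = carry on as usual.
payoff = {
    ("defect", "cooperate"): 4,    # free-ride on everyone else's caution
    ("cooperate", "cooperate"): 3, # shared sacrifice, curve crushed
    ("defect", "defect"): 1,       # no one complies; the virus spreads
    ("cooperate", "defect"): 0,    # sacrifice alone, to little effect
}

# Whatever the other person does, defecting pays more for me individually...
assert payoff[("defect", "cooperate")] > payoff[("cooperate", "cooperate")]
assert payoff[("defect", "defect")] > payoff[("cooperate", "defect")]
# ...yet everyone does better under mutual cooperation than mutual defection.
assert payoff[("cooperate", "cooperate")] > payoff[("defect", "defect")]
```

That gap between individual incentive and collective outcome is why the excerpt calls the lockdown a test of trustworthiness rather than of reasoning.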
  • While pretending to be rigorous, people relaxed and started going out. It was like watching somebody gradually give up on a diet. There wasn’t a big moment of capitulation, just an extra chocolate bar here, a bagel there, a scoop of ice cream before bed
  • in reality this was a mass moral failure of Republicans and Democrats and independents alike. This was a failure of social solidarity, a failure to look out for each other.
  • Alexis de Tocqueville discussed a concept called the social body. Americans were clearly individualistic, he observed, but they shared common ideas and common values, and could, when needed, produce common action. They could form a social body.
  • Over time, those common values eroded, and were replaced by a value system that put personal freedom above every other value
  • When Americans were confronted with the extremely hard task of locking down for months without any of the collective resources that would have made it easier—habits of deference to group needs; a dense network of community bonds to help hold each other accountable; a history of trust that if you do the right thing, others will too; preexisting patterns of cooperation; a sense of shame if you deviate from the group—they couldn’t do it. America failed.
  • The Crack-up
  • This wasn’t just a political and social crisis, it was also an emotional trauma.
  • The week before George Floyd was killed, the National Center for Health Statistics released data showing that a third of all Americans were showing signs of clinical anxiety or depression. By early June, after Floyd’s death, the percentage of Black Americans showing clinical signs of depression and anxiety disorders had jumped from 36 to 41 percent
  • By late June, American national pride was lower than at any time since Gallup started measuring, in 2001
  • In another poll, 71 percent of Americans said they were angry about the state of the country, and just 17 percent said they were proud.
  • By late June, it was clear that America was enduring a full-bore crisis of legitimacy, an epidemic of alienation, and a loss of faith in the existing order.
  • The most alienated, anarchic actors in society—antifa, the Proud Boys, QAnon—seemed to be driving events. The distrust doom loop was now at hand.
  • The Age of Precarity
  • Cultures are collective responses to common problems. But when reality changes, culture takes a few years, and a moral convulsion, to completely shake off the old norms and values.
  • The culture that is emerging, and which will dominate American life over the next decades, is a response to a prevailing sense of threat.
  • This new culture values security over liberation, equality over freedom, the collective over the individual.
  • From risk to security.
  • we’ve entered an age of precarity in which every political or social movement has an opportunity pole and a risk pole. In the opportunity mentality, risk is embraced because of the upside possibilities. In the risk mindset, security is embraced because people need protection from downside dangers
  • In this period of convulsion, almost every party and movement has moved from its opportunity pole to its risk pole.
  • From achievement to equality
  • In the new culture we are entering, that meritocratic system looks more and more like a ruthless sorting system that excludes the vast majority of people, rendering their life precarious and second class, while pushing the “winners” into a relentless go-go lifestyle that leaves them exhausted and unhappy
  • Equality becomes the great social and political goal. Any disparity—racial, economic, meritocratic—comes to seem hateful.
  • From self to society
  • If we’ve lived through an age of the isolated self, people in the emerging culture see embedded selves. Socialists see individuals embedded in their class group. Right-wing populists see individuals as embedded pieces of a national identity group. Left-wing critical theorists see individuals embedded in their racial, ethnic, gender, or sexual-orientation identity group.
  • The cultural mantra shifts from “Don’t label me!” to “My label is who I am.”
  • From global to local
  • When there is massive distrust of central institutions, people shift power to local institutions, where trust is higher. Power flows away from Washington to cities and states.
  • From liberalism to activism
  • enlightenment liberalism, which was a long effort to reduce the role of passions in politics and increase the role of reason. Politics was seen as a competition between partial truths.
  • Liberalism is ill-suited for an age of precarity. It demands that we live with a lot of ambiguity, which is hard when the atmosphere already feels unsafe. Furthermore, it is thin. It offers an open-ended process of discovery when what people hunger for is justice and moral certainty.
  • liberalism’s niceties come to seem like a cover that oppressors use to mask and maintain their systems of oppression. Public life isn’t an exchange of ideas; it’s a conflict of groups engaged in a vicious death struggle
  • The cultural shifts we are witnessing offer more safety to the individual at the cost of clannishness within society. People are embedded more in communities and groups, but in an age of distrust, groups look at each other warily, angrily, viciously.
  • The shift toward a more communal viewpoint is potentially a wonderful thing, but it leads to cold civil war unless there is a renaissance of trust. There’s no avoiding the core problem. Unless we can find a way to rebuild trust, the nation does not function.
  • How to Rebuild Trust
  • Historians have more to offer, because they can cite examples of nations that have gone from pervasive social decay to relative social health. The two most germane to our situation are Great Britain between 1830 and 1848 and the United States between 1895 and 1914.
  • In both periods, a highly individualistic and amoral culture was replaced by a more communal and moralistic one.
  • But there was a crucial difference between those eras and our own, at least so far. In both cases, moral convulsion led to frenetic action.
  • As Robert Putnam and Shaylyn Romney Garrett note in their forthcoming book, The Upswing, the American civic revival that began in the 1870s produced a stunning array of new organizations: the United Way, the NAACP, the Boy Scouts, the Forest Service, the Federal Reserve System, 4-H clubs, the Sierra Club, the settlement-house movement, the compulsory-education movement, the American Bar Association, the American Legion, the ACLU, and on and on
  • After the civic revivals, both nations witnessed frenetic political reform. During the 1830s, Britain passed the Reform Act, which widened the franchise; the Factory Act, which regulated workplaces; and the Municipal Corporations Act, which reformed local government.
  • The Progressive Era in America saw an avalanche of reform: civil-service reform; food and drug regulation; the Sherman Act, which battled the trusts; the secret ballot; and so on. Civic life became profoundly moralistic, but political life became profoundly pragmatic and anti-ideological. Pragmatism and social-science expertise were valued.
  • Can America in the 2020s turn itself around the way the America of the 1890s, or the Britain of the 1830s, did? Can we create a civic renaissance and a legislative revolution?
  • I see no scenario in which we return to being the nation we were in 1965, with a cohesive national ethos, a clear national establishment, trusted central institutions, and a pop-culture landscape in which people overwhelmingly watched the same shows and talked about the same things.
  • The age of distrust has smashed the converging America and the converging globe—that great dream of the 1990s—and has left us with the reality that our only plausible future is decentralized pluralism.
  • The key to making decentralized pluralism work still comes down to one question: Do we have the energy to build new organizations that address our problems, the way the Brits did in the 1830s and Americans did in the 1890s?
  • social trust is built within organizations in which people are bound together to do joint work, in which they struggle together long enough for trust to gradually develop, in which they develop shared understandings of what is expected of each other, in which they are enmeshed in rules and standards of behavior that keep them trustworthy when their commitments might otherwise falter.
  • Over the past 60 years, we have given up on the Rotary Club and the American Legion and other civic organizations and replaced them with Twitter and Instagram. Ultimately, our ability to rebuild trust depends on our ability to join and stick to organizations.
  • Whether we emerge from this transition stronger depends on our ability, from the bottom up and the top down, to build organizations targeted at our many problems. If history is any guide, this will be the work not of months, but of one or two decades.
  • For centuries, America was the greatest success story on earth, a nation of steady progress, dazzling achievement, and growing international power. That story threatens to end on our watch, crushed by the collapse of our institutions and the implosion of social trust
  • But trust can be rebuilt through the accumulation of small heroic acts—by the outrageous gesture of extending vulnerability in a world that is mean, by proffering faith in other people when that faith may not be returned. Sometimes trust blooms when somebody holds you against all logic, when you expected to be dropped.
  • By David Brooks
Javier E

Opinion | It's Time to Break Up Facebook - The New York Times - 1 views

  • For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence.
  • Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones
  • Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.
  • ...51 more annotations...
  • This shift, combined with business-friendly tax and regulatory policy, ushered in a period of mergers and acquisitions that created megacorporations
  • In the past 20 years, more than 75 percent of American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.
  • Because Facebook so dominates social networking, it faces no market-based accountability. This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation.
  • Over a decade later, Facebook has earned the prize of domination. It is worth half a trillion dollars and commands, by my estimate, more than 80 percent of the world’s social networking revenue. It is a powerful monopoly, eclipsing all of its rivals and erasing competition from the social networking category.
  • Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products
  • Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp.
  • As a result of all this, would-be competitors can’t raise the money to take on Facebook. Investors realize that if a company gets traction, Facebook will copy its innovations, shut it down or acquire it for a relatively modest sum
  • Facebook’s dominance is not an accident of history. The company’s strategy was to beat every competitor in plain view, and regulators and the government tacitly — and at times explicitly — approved
  • The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.
  • Neither Instagram nor WhatsApp had any meaningful revenue, but both were incredibly popular. The Instagram acquisition guaranteed Facebook would preserve its dominance in photo networking, and WhatsApp gave it a new entry into mobile real-time messaging.
  • When it hasn’t acquired its way to dominance, Facebook has used its monopoly position to shut out competing companies or has copied their technology.
  • In 2014, the rules favored curiosity-inducing “clickbait” headlines. In 2016, they enabled the spread of fringe political views and fake news, which made it easier for Russian actors to manipulate the American electorate.
  • As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon)
  • I don’t blame Mark for his quest for domination. He has demonstrated nothing more nefarious than the virtuous hustle of a talented entrepreneur
  • It’s on our government to ensure that we never lose the magic of the invisible hand. How did we allow this to happen?
  • a narrow reliance on whether or not consumers have experienced price gouging fails to take into account the full cost of market domination
  • It doesn’t recognize that we also want markets to be competitive to encourage innovation and to hold power in check. And it is out of step with the history of antitrust law. Two of the last major antitrust suits, against AT&T and IBM in the 1980s, were grounded in the argument that they had used their size to stifle innovation and crush competition.
  • It is a disservice to the laws and their intent to retain such a laserlike focus on price effects as the measure of all that antitrust was meant to do.
  • Facebook is the perfect case on which to reverse course, precisely because Facebook makes its money from targeted advertising, meaning users do not pay to use the service. But it is not actually free, and it certainly isn’t harmless.
  • We pay for Facebook with our data and our attention, and by either measure it doesn’t come cheap.
  • The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.
  • The vibrant marketplace that once drove Facebook and other social media companies to compete to come up with better products has virtually disappeared. This means there’s less chance of start-ups developing healthier, less exploitative social media platforms. It also means less accountability on issues like privacy.
  • The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people.
  • Facebook engineers write algorithms that select which users’ comments or experiences end up displayed in the News Feeds of friends and family. These rules are proprietary and so complex that many Facebook employees themselves don’t understand them.
  • What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.
  • In January 2018, Mark announced that the algorithms would favor non-news content shared by friends and news from “trustworthy” sources, which his engineers interpreted — to the confusion of many — as a boost for anything in the category of “politics, crime, tragedy.”
  • As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.
  • No one at Facebook headquarters is choosing what single news story everyone in America wakes up to, of course. But they do decide whether it will be an article from a reputable outlet or a clip from “The Daily Show,” a photo from a friend’s wedding or an incendiary call to kill others.
  • Mark knows that this is too much power and is pursuing a twofold strategy to mitigate it. He is pivoting Facebook’s focus toward encouraging more private, encrypted messaging that Facebook’s employees can’t see, let alone control
  • Second, he is hoping for friendly oversight from regulators and other industry executives.
  • In an op-ed essay in The Washington Post in March, he wrote, “Lawmakers often tell me we have too much power over speech, and I agree.” And he went even further than before, calling for more government regulation — not just on speech, but also on privacy and interoperability, the ability of consumers to seamlessly leave one network and transfer their profiles, friend connections, photos and other data to another.
  • I don’t think these proposals were made in bad faith. But I do think they’re an attempt to head off the argument that regulators need to go further and break up the company. Facebook isn’t afraid of a few more rules. It’s afraid of an antitrust case and of the kind of accountability that real government oversight would bring.
  • We don’t expect calcified rules or voluntary commissions to work to regulate drug companies, health care companies, car manufacturers or credit card providers. Agencies oversee these industries to ensure that the private market works for the public good. In these cases, we all understand that government isn’t an external force meddling in an organic market; it’s what makes a dynamic and fair market possible in the first place. This should be just as true for social networking as it is for air travel or pharmaceuticals.
  • Just breaking up Facebook is not enough. We need a new agency, empowered by Congress to regulate tech companies. Its first mandate should be to protect privacy.
  • First, Facebook should be separated into multiple companies. The F.T.C., in conjunction with the Justice Department, should enforce antitrust laws by undoing the Instagram and WhatsApp acquisitions and banning future acquisitions for several years.
  • How would a breakup work? Facebook would have a brief period to spin off the Instagram and WhatsApp businesses, and the three would become distinct companies, most likely publicly traded.
  • Facebook is indeed more valuable when there are more people on it: There are more connections for a user to make and more content to be shared. But the cost of entering the social network business is not that high. And unlike with pipes and electricity, there is no good argument that the country benefits from having only one dominant social networking company.
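A quick back-of-the-envelope way to see the network effect described above: with n users, the number of possible pairwise connections grows roughly with the square of n. A minimal Python sketch (illustrative arithmetic only, not a figure from the op-ed):

```python
# Possible pairwise connections among n users: n * (n - 1) / 2.
# Doubling the user base roughly quadruples the potential connections,
# which is the network effect that entrenches an incumbent.
def possible_connections(n: int) -> int:
    return n * (n - 1) // 2

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} users -> {possible_connections(n):>14,} possible connections")
```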
  • others worry that the breakup of Facebook or other American tech companies could be a national security problem. Because advancements in artificial intelligence require immense amounts of data and computing power, only large companies like Facebook, Google and Amazon can afford these investments, they say. If American companies become smaller, the Chinese will outpace us.
  • The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.
  • But the biggest winners would be the American people. Imagine a competitive market in which they could choose among one network that offered higher privacy standards, another that cost a fee to join but had little advertising and another that would allow users to customize and tweak their feeds as they saw fit
  • The cost of breaking up Facebook would be next to zero for the government, and lots of people stand to gain economically. A ban on short-term acquisitions would ensure that competitors, and the investors who take a bet on them, would have the space to flourish. Digital advertisers would suddenly have multiple companies vying for their dollars.
  • The Europeans have made headway on privacy with the General Data Protection Regulation, a law that guarantees users a minimal level of protection. A landmark privacy bill in the United States should specify exactly what control Americans have over their digital information, require clearer disclosure to users and provide enough flexibility to the agency to exercise effective oversight over time
  • The agency should also be charged with guaranteeing basic interoperability across platforms.
  • Finally, the agency should create guidelines for acceptable speech on social media
  • We will have to create similar standards that tech companies can use. These standards should of course be subject to the review of the courts, just as any other limits on speech are. But there is no constitutional right to harass others or live-stream violence.
  • These are difficult challenges. I worry that government regulators will not be able to keep up with the pace of digital innovation
  • I worry that more competition in social networking might lead to a conservative Facebook and a liberal one, or that newer social networks might be less secure if government regulation is weak
  • Professor Wu has written that this “policeman at the elbow” led IBM to steer clear “of anything close to anticompetitive conduct, for fear of adding to the case against it.”
  • Finally, an aggressive case against Facebook would persuade other behemoths like Google and Amazon to think twice about stifling competition in their own sectors, out of fear that they could be next.
  • The alternative is bleak. If we do not take action, Facebook’s monopoly will become even more entrenched. With much of the world’s personal communications in hand, it can mine that data for patterns and trends, giving it an advantage over competitors for decades to come.
  • This movement of public servants, scholars and activists deserves our support. Mark Zuckerberg cannot fix Facebook, but our government can.
Javier E

Facebook's Dangerous Experiment on Teen Girls - The Atlantic - 0 views

  • Much more than for boys, adolescence typically heightens girls’ self-consciousness about their changing body and amplifies insecurities about where they fit in their social network. Social media—particularly Instagram, which displaces other forms of interaction among teens, puts the size of their friend group on public display, and subjects their physical appearance to the hard metrics of likes and comment counts—takes the worst parts of middle school and glossy women’s magazines and intensifies them.
  • The preponderance of the evidence now available is disturbing enough to warrant action.
  • The toxicity comes from the very nature of a platform that girls use to post photographs of themselves and await the public judgments of others.
  • ...35 more annotations...
  • Similar increases occurred at the same time for girls in Canada for mood disorders and for self-harm. Girls in the U.K. also experienced very large increases in anxiety, depression, and self-harm (with much smaller increases for boys).
  • Some have argued that these increases reflect nothing more than Gen Z’s increased willingness to disclose their mental-health problems. But researchers have found corresponding increases in measurable behaviors such as suicide (for both sexes), and emergency-department admissions for self-harm (for girls only). From 2010 to 2014, rates of hospital admission for self-harm did not increase at all for women in their early 20s, or for boys or young men, but they doubled for girls ages 10 to 14.
  • The available evidence suggests that Facebook’s products have probably harmed millions of girls. If public officials want to make that case, it could go like this:
  • from 2010 to 2014, high-school students moved much more of their lives onto social-media platforms.
  • National surveys of American high-school students show that only about 63 percent reported using a “social networking site” on a daily basis back in 2010.
  • But as smartphone ownership increased, access became easier and visits became more frequent. By 2014, 80 percent of high-school students said they used a social-media platform on a daily basis, and 24 percent said that they were online “almost constantly.”
  • 2. The timing points to social media.
  • Notably, girls became much heavier users of the new visually oriented platforms, primarily Instagram (which by 2013 had more than 100 million users), followed by Snapchat, Pinterest, and Tumblr.
  • Boys are glued to their screens as well, but they aren’t using social media as much; they spend far more time playing video games. When a boy steps away from the console, he does not spend the next few hours worrying about what other players are saying about him
  • Instagram, in contrast, can loom in a girl’s mind even when the app is not open, driving hours of obsessive thought, worry, and shame.
  • 3. The victims point to Instagram.
  • In 2017, British researchers asked 1,500 teens to rate how each of the major social-media platforms affected them on certain well-being measures, including anxiety, loneliness, body image, and sleep. Instagram scored as the most harmful, followed by Snapchat and then Facebook.
  • Facebook’s own research, leaked by the whistleblower Frances Haugen, has a similar finding: “Teens blame Instagram for increases in the rate of anxiety and depression … This reaction was unprompted and consistent across all groups.” The researchers also noted that “social comparison is worse” on Instagram than on rival apps.
  • 4. No other suspect is equally plausible.
  • A recent experiment confirmed these observations: Young women were randomly assigned to use Instagram, use Facebook, or play a simple video game for seven minutes. The researchers found that “those who used Instagram, but not Facebook, showed decreased body satisfaction, decreased positive affect, and increased negative affect.”
  • Snapchat’s filters “keep the focus on the face,” whereas Instagram “focuses heavily on the body and lifestyle.”
  • (Boys lost less, and may even have gained, when they took up multiplayer fantasy games, especially those that put them into teams.)
  • The subset of studies that allow researchers to isolate social media, and Instagram in particular, show a much stronger relationship with poor mental health. The same goes for those that zoom in on girls rather than all teens.
  • In a 2019 internal essay, Andrew Bosworth, a longtime company executive, wrote: While Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us there is a special place for it in our lives. But like all things it benefits from moderation.
  • Bosworth was proposing what medical researchers call a “dose-response relationship.” Sugar, salt, alcohol, and many other substances that are dangerous in large doses are harmless in small ones.
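Bosworth’s sugar analogy maps onto the standard dose-response shape: negligible harm at low doses, rising sharply past some threshold. A minimal sketch using a logistic curve; the functional form, the three-hour threshold, and the steepness value are assumptions for illustration, not figures from Facebook’s research:

```python
import math

def modeled_harm(hours_per_day: float, threshold: float = 3.0, steepness: float = 2.0) -> float:
    """Logistic dose-response: near zero below the threshold,
    rising toward 1 as daily use climbs past it."""
    return 1.0 / (1.0 + math.exp(-steepness * (hours_per_day - threshold)))

for hours in (0.5, 1, 2, 3, 4, 6):
    print(f"{hours:>3} h/day -> modeled harm {modeled_harm(hours):.2f}")
```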
  • his framing also implies that any health problems caused by social media result from the user’s lack of self-control. That’s exactly what Bosworth concluded: “Each of us must take responsibility for ourselves.” The dose-response frame also points to cheap solutions that pose no threat to its business model. The company can simply offer more tools to help Instagram and Facebook users limit their consumption.
  • social-media platforms are not like sugar. They don’t just affect the individuals who overindulge. Rather, when teens went from texting their close friends on flip phones in 2010 to posting carefully curated photographs and awaiting comments and likes by 2014, the change rewired everyone’s social life.
  • Improvements in technology generally help friends connect, but the move onto social-media platforms also made it easier—indeed, almost obligatory—for users to perform for one another.
  • Public performance is risky. Private conversation is far more playful. A bad joke or poorly chosen word among friends elicits groans, or perhaps a rebuke and a chance to apologize. Getting repeated feedback in a low-stakes environment is one of the main ways that play builds social skills, physical skills, and the ability to properly judge risk. Play also strengthens friendships.
  • When girls started spending hours each day on Instagram, they lost many of the benefits of play.
  • First, Congress should pass legislation compelling Facebook, Instagram, and all other social-media platforms to allow academic researchers access to their data. One such bill is the Platform Transparency and Accountability Act, proposed by the Stanford University researcher Nate Persily.
  • The wrong photo can lead to school-wide or even national infamy, cyberbullying from strangers, and a permanent scarlet letter
  • Performative social media also puts girls into a trap: Those who choose not to play the game are cut off from their classmates
  • Instagram and, more recently, TikTok have become wired into the way teens interact, much as the telephone became essential to past generations.
  • Without a proper control group, we can’t be certain that the experiment has been a catastrophic failure, but it probably has been. Until someone comes up with a more plausible explanation for what has happened to Gen Z girls, the most prudent course of action for regulators, legislators, and parents is to take steps to mitigate the harm.
  • Correlation does not prove causation, but nobody has yet found an alternative explanation for the massive, sudden, gendered, multinational deterioration of teen mental health during the period in question.
  • Second, Congress should toughen the 1998 Children’s Online Privacy Protection Act. An early version of the legislation proposed 16 as the age at which children should legally be allowed to give away their data and their privacy.
  • Unfortunately, e-commerce companies lobbied successfully to have the age of “internet adulthood” set instead at 13. Now, more than two decades later, today’s 13-year-olds are not doing well. Federal law is outdated and inadequate. The age should be raised. More power should be given to parents, less to companies.
  • Third, while Americans wait for lawmakers to act, parents can work with local schools to establish a norm: Delay entry to Instagram and other social platforms until high school.
  • Right now, families are trapped. I have heard many parents say that they don’t want their children on Instagram, but they allow them to lie about their age and open accounts because, well, that’s what everyone else has done.
Javier E

ThinkUp Helps the Social Network User See the Online Self - NYTimes.com - 0 views

  • In addition to a list of people’s most-used words and other straightforward stats like follower counts, ThinkUp shows subscribers more unusual information such as how often they thank and congratulate people, how frequently they swear, whose voices they tend to amplify and which posts get the biggest reaction and from whom.
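The kind of tallying ThinkUp performs can be approximated in a few lines. A minimal sketch over a list of posts; the Post structure and the word lists are hypothetical stand-ins, not ThinkUp’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int

THANKS = {"thanks", "thank", "congrats", "congratulations"}  # stand-in list
SWEARS = {"damn", "hell"}                                     # stand-in list

def tally(posts: list[Post]) -> dict:
    """Count thank-yous and swears, and find the post with the biggest reaction."""
    words = [w.strip(".,!?").lower() for p in posts for w in p.text.split()]
    return {
        "posts": len(posts),
        "thanks": sum(w in THANKS for w in words),
        "swears": sum(w in SWEARS for w in words),
        "top_post": max(posts, key=lambda p: p.likes).text,
    }

feed = [Post("Thanks for the kind words!", 12), Post("Damn, what a week.", 3)]
print(tally(feed))
```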
  • Every morning the service delivers an email packed with information, and in its weighty thoroughness, it reminds you that what you do on Twitter and Facebook can change your life, and other people’s lives, in important, sometimes unforeseen ways.
  • ThinkUp is something like Elf on the Shelf for digitally addled adults — a constant reminder that someone is watching you, and that you’re being judged.
  • ...14 more annotations...
  • after using ThinkUp for about six months, I’ve found it to be an indispensable guide to how I navigate social networks.
  • “The goal is to make you act like less of a jerk online,” Ms. Trapani said. “The big goal is to create mindfulness and awareness, and also behavioral change.”
  • people often tweet and update without any perspective about themselves. That’s because Facebook and Twitter, as others have observed, have a way of infecting our brains.
  • Because social networks often suggest a false sense of intimacy, they tend to lower people’s self-control.
  • Like a drug or perhaps a parasite, they worm into your devices, your daily habits and your every free moment, and they change how you think.
  • For those of us most deeply afflicted, myself included, every mundane observation becomes grist for a 140-character quip, and every interaction a potential springboard into an all-consuming, emotionally wrenching flame battle.
  • One of the biggest dangers is saying something off the cuff that might make sense in a particular context, but that sounds completely off the rails to the wider public. The problem, in other words, is acting without thinking — being caught up in the moment, without pausing to reflect on the long-term consequences. You’re never more than a few taps away from an embarrassment that might ruin your career, or at least your reputation, for years to come.
  • More basically, though, it’s helped me pull back from social networks. Each week, ThinkUp tells me how often I’ve tweeted. Sometimes that number is terribly high — a few weeks ago it was more than 800 times — and I realize I’m probably overtaxing my followers
  • getting a daily reminder from ThinkUp that there are good ways and bad ways to behave online — has a tendency to focus the mind.
  • even though “never tweet” became a popular, ironic thing to tweet this year, actually never tweeting, and never being on Facebook, is becoming nearly impossible for many people.
  • ThinkUp charges $5 a month for each social network you connect to it. Is it worth it? After all, there’s a better, more surefire way of avoiding any such long-term catastrophe caused by social media: Just stop using social networks.
  • your online profile plays an important role in how you’re perceived by potential employers. In a recent survey commissioned by the job-hunting site CareerBuilder, almost half of companies said they perused job-seekers’ social networking profiles to look for red flags and to see what sort of image prospective employees portrayed online.
  • The main issue constraining growth, the founders say, is that it has been difficult to explain to people why they might need ThinkUp.
  • That may change as more people falter on social networks, either by posting unthinking comments that end up damaging their careers, or simply by annoying people to the point that their online presence becomes a hindrance to their real-life prospects.
Javier E

Book review of The Square and the Tower: Networks and Power, from the Freemasons to Fac... - 0 views

  • Ferguson maintains that historians have paid too much attention to hierarchies (monarchies, empires, nation-states, governments, armies, corporations) and too little to the loose social networks that often end up disrupting them.
  • “traditional historical research relied heavily for its source material on the documents produced by hierarchical institutions such as states. Networks do keep records, but they are not so easy to find.”
  • The author argues that dismissing the role of social networks is a grave mistake because these loose organizational arrangements have been far more important in shaping history than most historians know or are prepared to accept
  • ...7 more annotations...
  • the power of networks has varied over time and that the relative importance of the tower and the square has ebbed and flowed. Nonetheless, Ferguson sees two specific periods as standing out as intensely “networked eras.” The first started in the late 15th century, after the introduction in Europe of the printing press, and lasted until the late 18th century. The second, “our own time,” began in the 1970s and is still going on.
  • from the late 1790s until the late 1960s, was terrible for networks. Ferguson writes that “hierarchical institutions re-established their control and successfully shut down or co-opted networks. The zenith of hierarchically organized power was in fact the mid-twentieth century — the era of totalitarian regimes and total war.”
  • “The Square and the Tower” will not disappoint readers who have come to expect from Ferguson ambition, erudition, originality and expansive historical panoramas. These often come mixed with telling anecdotes, illuminating minutiae, fun facts and even some facile one-liners that, while entertaining, don’t add much to the argument.
  • it is too much, and not all of it is illuminated by the “theoretical insights from myriad disciplines.” In fact, it is surprising how little Ferguson relies on the initial chapters on network theory to make his case.
  • In the remaining eight parts of the book, this network theory mostly disappears and the story is told in standard historical narrative.
  • its main unit of analysis, the social network, is too imprecise a concept to provide a solid foundation from which to launch the book’s epic theorizing. Most networks have some hierarchical features, and, as Ferguson notes, “a hierarchy is just a special kind of network.”
  • Nonetheless, the networks-and-hierarchies dichotomy does work as a narrative device that allows a gifted storyteller to take his readers on a fascinating tour of world history.
Javier E

Social Media and the Devolution of Friendship: Full Essay (Pts I & II) » Cybo... - 0 views

  • social networking sites create pressure to put time and effort into tending weak ties, and how it can be impossible to keep up with them all. Personally, I also find it difficult to keep up with my strong ties. I’m a great “pick up where we left off” friend, as are most of the people closest to me (makes sense, right?). I’m decidedly sub-awesome, however, at being in constant contact with more than a few people at a time.
  • the devolution of friendship. As I explain over the course of this essay, I link the devolution of friendship to—but do not “blame” it on—the affordances of various social networking platforms, especially (but not exclusively) so-called “frictionless sharing” features.
  • I’m using the word here in the same way that people use it to talk about the devolution of health care. One example of devolution of health care is some outpatient surgeries: patients are allowed to go home after their operations, but they still require a good deal of post-operative care such as changing bandages, irrigating wounds, administering medications, etc. Whereas before these patients would stay in the hospital and nurses would perform the care-labor necessary for their recoveries, patients must now find their own caregivers (usually family members or friends; sometimes themselves) to perform free care-labor. In this context, devolution marks the shift of labor and responsibility away from the medical establishment and onto the patient; within the patient-medical establishment collaboration, the patient must now provide a greater portion of the necessary work. Similarly, in some ways, we now expect our friends to do a greater portion of the work of being friends with us.
  • ...13 more annotations...
  • In short, “sharing” has become a lot easier and a lot more efficient, but “being shared with” has become much more time-consuming, demanding, and inefficient (especially if we don’t ignore most of our friends most of the time). Given this, expecting our friends to keep up with our social media content isn’t expecting them to meet us halfway; it’s asking them to take on the lion’s share of staying in touch with us. Our jobs (in this role) have gotten easier; our friends’ jobs have gotten harder.
  • We’re busy people; we like the idea of making one announcement on Facebook and being done with it, rather than having to repeat the same story over and over again to different friends individually. We also like not always having to think about which friends might like which stories or songs; we like the idea of sharing with all of our friends at once, and then letting them sort out amongst themselves who is and isn’t interested. Though social media can create burdensome expectations to keep up with strong ties, weak ties, and everyone in between, social media platforms can also be very efficient. Using the same moment of friendship-labor to tend multiple friendships at once kills more birds with fewer stones.
  • sometimes we like the devolution of friendship. When we have to ‘pull’ friendship-content instead of receiving it in a ‘push’, we can pick and choose which content items to pull. We can ignore the baby pictures, or the pet pictures, or the sushi pictures—whatever it is our friends post that we only pretend to care about
  • Within devolved friendship interactions, it takes less effort to be polite while secretly waiting for someone to please just stop talking.
  • While I won’t go so far as to say they’re definitely ‘problems,’ there are two major things about devolved friendship that I think are worth noting. The first is the non-uniform rationalization of friendship-labor, and the second is the depersonalization of friendship-labor.
  • Through social media, “sharing with friends” is rationalized to the point of relentless efficiency. The current apex of such rationalization is frictionless sharing: we no longer need to perform the labor of telling our individual friends about what we read online, or of copy-pasting links and emailing them to “the list,” or of clicking a button for one-step posting of links on our Facebook walls. With frictionless sharing, all we have to do is look, or listen; what we’ve read or watched or listened to is then “shared” or “scrobbled” to our Facebook, Twitter, Tumblr, or whatever other online profiles. Whether we share content actively or passively, however, we feel as though we’ve done our half of the friendship-labor by ‘pushing’ the information to our walls, streams, and tumblelogs. It’s then up to our friends to perform their halves of the friendship-labor by ‘pulling’ the information we share from those platforms.
  • The second thing worth noting is that devolved friendship is also depersonalized friendship.
  • Personal interaction doesn’t just happen on Spotify, and since I was hoping Spotify would be the New Porch, I initially found Spotify to be somewhat lonely-making. It’s the mutual awareness of presence that gives companionate silence its warmth, whether in person or across distance. The silence within Spotify’s many sounds, on the other hand, felt more like being on the outside looking in. This isn’t to say that Spotify can’t be social in a more personal way; once I started sending tracks to my friends, a few of them started sending tracks in return. But it took a lot more work to get to that point, which gets back to the devolution of friendship (as I explain below).
  • I’ve been thinking since, however, on what it means to view our friends as “generalized others.” I may now feel like less of like “creepy stalker” when I click on a song in someone’s Spotify feed, but I don’t exactly feel ‘shared with’ either. Far as I know, I’ve never been SpotiVaguebooked (or SubSpotified?); I have no reason to think anyone is speaking to me personally as they listen to music, or as they choose not to disable scrobbling (if they make that choice consciously at all). I may have been granted the opportunity to view something, but it doesn’t follow that what I’m viewing has anything to do with me unless I choose to make it about me. Devolved friendship means it’s not up to us to interact with our friends personally; instead it’s now up to our friends to make our generalized broadcasts personal.
  • When we consider the lopsided rationalization of ‘sharing’ and ‘shared with,’ as well as the depersonalization of frictionless sharing and generalized broadcasting, what becomes clear is this: the social media deck is stacked in such a way as to make being ‘a self’ easier and more rewarding than being ‘a friend.’
  • It’s easy to share, to broadcast, to put our selves and our tastes and our identity performances out into the world for others to consume; what feedback and friendship we get in return comes in response to comparatively little effort and investment from us. It takes a lot more work, however, to do the consumption, to sift through everything all (or even just some) of our friends produce, to do the work of connecting to our friends’ generalized broadcasts so that we can convert their depersonalized shares into meaningful friendship-labor.
  • We may be prosumers of social media, but the reward structures of social media sites encourage us to place greater emphasis on our roles as share-producers—even though many of us probably spend more time consuming shared content than producing it. There’s a reason for this, of course; the content we produce (for free) is what fuels every last ‘Web 2.0’ machine, and its attendant self-centered sociality is the linchpin of the peculiarly Silicon Valley concept of “Social” (something Nathan Jurgenson and I discuss together in greater detail here). It’s not super-rewarding to be one of ten people who “like” your friend’s shared link, but it can feel rewarding to get 10 “likes” on something you’ve shared—even if you have hundreds or thousands of ‘friends.’ Sharing is easy; dealing with all that shared content is hard.
  • But I wonder sometimes if the shifts in expectation that accompany devolved friendship don’t migrate across platforms and contexts in ways we don’t always see or acknowledge. Social media affects how we see the world—and how we feel about being seen in the world—even when we’re not engaged directly with social media websites. It’s not a stretch, then, to imagine that the affordances of social media platforms might also affect how we see friendship and our obligations as friends most generally.
Javier E

Social Media Is the Problem - The Bulwark - 0 views

  • It’s 1995. A man stands on a busy street corner yelling vaguely incoherent things at the passersby. He’s holding a placard that says “THE END IS NIGH. REPENT.”
  • No reasonable person would think of convincing this man that his point of view is incorrect. This isn’t an opportunity for an engaging debate.
  • Now fast forward to 2020.
  • ...27 more annotations...
  • In terms of who this guy is and who you are absolutely nothing has changed. And yet here you are—arguing with him on Twitter or Facebook. And you, yourself, are being brought to the brink of insanity. But you can’t seem to stop. You have to respond or read the comments of the other people responding and your cortisol and adrenaline levels are spiking and your blood pressure is rising and you’re suddenly at risk of a heart attack
  • And the ugly truth is that you’ve become addicted to arguing with the “End Is Nigh” sandwich board guy
  • Anti-vaxxers, anti-maskers, QAnon, cancel culture, Alex Jones, flat-earthers, racists, anti-racists, anti-anti-racists, and of course the Twitter stylings of our Dear Leader.
  • Back in 2011 Chamath Palihapitiya left Facebook and said of his former company, “It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works.”
  • I’m here to make the case that all modern social, political, and sociological ills can be traced to social media. It is single-handedly responsible for the tearing apart of our social fabric which Palihapitiya so presciently predicted
  • It’s not “part of” the problem. It is the problem: An insidious malware slowly corrupting our society in ways that are extremely difficult to quantify, but the effects of which are evident all around us.
  • maybe you don’t even know why you’re doing it. But you can’t stop, won’t stop.
  • Before the internet people socialized in relatively small, geographically constrained groups. They had friends and colleagues and relatives and they communicated with these people largely in person or via the phone, using the rules of engagement that have been evolving for generations.
  • (1) The fact that your phone in your pocket guarantees that you can get your fix at every minute of every day.
  • (2) The unfortunate reality that media organizations are so starved for content that every time something outrageous garners a small buzz on social media they immediately project and amplify it out to the masses.
  • Of course, these are patently insane ideas that don’t deserve consideration. But there you are considering them
  • These include facial movements, and vocal intonation or more global cues such as “does this person look and smell like they haven’t showered for a week?” These are tried and true and essential components to a healthy social “network.”
  • it’s all brought to you exclusively and specifically by social media. It is exacerbated by two things:
  • But along came the internet and the EIN guy became an anonymous Internet denizen who could insert himself into conversations across the globe. First he did this on listservs and chat rooms and message boards. Then he did it in the comments sections. And with the advent of social media, he did it right in your face, courtesy of The Algorithm
  • EIN guy is now just part of the crowd. And what’s worse, while every town has one EIN guy, the internet has allowed all of the EIN guys to find each other so that now they think they’re just as normal as everyone else.
  • Now you’re doubting yourself, too, because it’s one thing to ignore one crazy guy—but a crazy movement?
  • No—you can’t ignore that—it’s your duty as a responsible citizen to quash it before it gets out of control and you don’t even realize that instead of quashing it, you’re now part of it.
  • Because what EIN guy always wanted—more than anything—was for the normies to stop walking past him. He wanted them to notice him and argue with him because that would be a sign that what he had to say was important and legitimate.
  • Social media has made it possible for deranged people to break through what I think of as the holistic herd immunity of sanity which geography has traditionally conferred
  • And once they broke through, thanks to social media, the traditional media decided to start elevating them.
  • journalistic outlets now rush out to broadcast anything weird enough to draw an audience
  • Maybe the earth is flat?! Maybe QAnon is right?! Maybe vaccines are super dangerous?!
  • In such an environment the only place for the “End Is Nigh” guy to get an audience is on the street corner
  • So what’s the answer? It’s shockingly simple. Leave all social media. Try it for one month.
  • There are very real actions that social media companies can take to help move things back towards sanity. People like Tristan Harris and Jaron Lanier and Roger McNamee have been discussing this for years. But social media companies aren’t going to do anything helpful so long as the incentive structure is what it is today.
  • Like most evil things that are bad for you, social media has enough attractive, useful, and even beneficial components to give you the false impression that it’s actually a good thing. Or at least harmless
  • In the future, we may be able to defang and declaw it and everyone can have it as a pet. But that’s somewhere down the road when Mark Zuckerberg isn’t the most powerful man in the world.
Javier E

Facebook Is a Doomsday Machine - The Atlantic - 0 views

  • megadeath is not the only thing that makes the Doomsday Machine petrifying. The real terror is in its autonomy, this idea that it would be programmed to detect a series of environmental inputs, then to act, without human interference. “There is no chance of human intervention, control, and final decision,” wrote the military strategist Herman Kahn in his 1960 book, On Thermonuclear War, which laid out the hypothetical for a Doomsday Machine. The concept was to render nuclear war unwinnable, and therefore unthinkable.
  • No machine should be that powerful by itself—but no one person should be either.
  • so far, somewhat miraculously, we have figured out how to live with the bomb. Now we need to learn how to survive the social web.
  • ...41 more annotations...
  • There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view.
  • Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.
  • I realized only recently that I’ve been thinking far too narrowly about the problem.
  • Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.
  • Facebook does not exist to seek truth and report it, or to improve civic health, or to hold the powerful to account, or to represent the interests of its users, though these phenomena may be occasional by-products of its existence.
  • The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning.
  • Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences.
  • Facebook decided that it needed not just a very large user base, but a tremendous one, unprecedented in size. That decision set Facebook on a path to escape velocity, to a tipping point where it can harm society just by existing.
  • No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine.
  • Scale and engagement are valuable to Facebook because they’re valuable to advertisers. These incentives lead to design choices such as reaction buttons that encourage users to engage easily and often, which in turn encourage users to share ideas that will provoke a strong response.
  • Every time you click a reaction button on Facebook, an algorithm records it, and sharpens its portrait of who you are.
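One way to picture that sharpening: each reaction nudges a per-user interest profile toward the topics of the post just reacted to, and targeting later reads the profile back out. A minimal sketch; the topic labels and the update rule are assumptions for illustration, not Facebook’s actual algorithm:

```python
LEARNING_RATE = 0.1

def record_reaction(profile: dict[str, float], post_topics: list[str]) -> None:
    """Nudge each topic's weight toward 1.0 whenever the user reacts to it."""
    for topic in post_topics:
        old = profile.get(topic, 0.0)
        profile[topic] = old + LEARNING_RATE * (1.0 - old)

profile: dict[str, float] = {}
for topics in (["politics"], ["politics", "outrage"], ["gardening"]):
    record_reaction(profile, topics)

# The profile now ranks what provokes this user to engage.
print(sorted(profile.items(), key=lambda kv: -kv[1]))
```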
  • The hyper-targeting of users, made possible by reams of their personal data, creates the perfect environment for manipulation—by advertisers, by political campaigns, by emissaries of disinformation, and of course by Facebook itself, which ultimately controls what you see and what you don’t see on the site.
  • there aren’t enough moderators speaking enough languages, working enough hours, to stop the biblical flood of shit that Facebook unleashes on the world, because 10 times out of 10, the algorithm is faster and more powerful than a person.
  • At megascale, this algorithmically warped personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.
  • These dangers are not theoretical, and they’re exacerbated by megascale, which makes the platform a tantalizing place to experiment on people
  • Even after U.S. intelligence agencies identified Facebook as a main battleground for information warfare and foreign interference in the 2016 election, the company has failed to stop the spread of extremism, hate speech, propaganda, disinformation, and conspiracy theories on its site.
  • it wasn’t until October of this year, for instance, that Facebook announced it would remove groups, pages, and Instagram accounts devoted to QAnon, as well as any posts denying the Holocaust.
  • In the days after the 2020 presidential election, Zuckerberg authorized a tweak to the Facebook algorithm so that high-accuracy news sources such as NPR would receive preferential visibility in people’s feeds, and hyper-partisan pages such as Breitbart News’s and Occupy Democrats’ would be buried, according to The New York Times, offering proof that Facebook could, if it wanted to, turn a dial to reduce disinformation—and offering a reminder that Facebook has the power to flip a switch and change what billions of people see online.
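That dial can be pictured as a single weight in a ranking score that blends predicted engagement with a source-quality signal. A minimal sketch; the field names, scores, and weighting scheme are assumptions for illustration, not Facebook’s actual formula:

```python
def rank_score(engagement: float, source_quality: float, quality_weight: float) -> float:
    """Blend predicted engagement with source quality; turning
    quality_weight up buries low-accuracy pages, turning it down
    lets raw engagement dominate the feed."""
    return (1 - quality_weight) * engagement + quality_weight * source_quality

stories = {"high-accuracy explainer": (0.4, 0.9), "hyper-partisan rant": (0.9, 0.1)}
for w in (0.0, 0.5):
    ranked = sorted(stories, key=lambda s: -rank_score(*stories[s], w))
    print(f"quality_weight={w}: {ranked}")
```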
  • reducing the prevalence of content that Facebook calls “bad for the world” also reduces people’s engagement with the site. In its experiments with human intervention, the Times reported, Facebook calibrated the dial so that just enough harmful content stayed in users’ news feeds to keep them coming back for more.
  • Facebook’s stated mission—to make the world more open and connected—has always seemed, to me, phony at best, and imperialist at worst.
  • Facebook is a borderless nation-state, with a population of users nearly as big as China and India combined, and it is governed largely by secret algorithms
  • How much real-world violence would never have happened if Facebook didn’t exist? One of the people I’ve asked is Joshua Geltzer, a former White House counterterrorism official who is now teaching at Georgetown Law. In counterterrorism circles, he told me, people are fond of pointing out how good the United States has been at keeping terrorists out since 9/11. That’s wrong, he said. In fact, “terrorists are entering every single day, every single hour, every single minute” through Facebook.
  • Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.
  • In previous eras, U.S. officials could at least study, say, Nazi propaganda during World War II, and fully grasp what the Nazis wanted people to believe. Today, “it’s not a filter bubble; it’s a filter shroud,” Geltzer said. “I don’t even know what others with personalized experiences are seeing.”
  • Mary McCord, the legal director at the Institute for Constitutional Advocacy and Protection at Georgetown Law, told me that she thinks 8kun may be more blatant in terms of promoting violence but that Facebook is “in some ways way worse” because of its reach. “There’s no barrier to entry with Facebook,” she said. “In every situation of extremist violence we’ve looked into, we’ve found Facebook postings. And that reaches tons of people. The broad reach is what brings people into the fold and normalizes extremism and makes it mainstream.” In other words, it’s the megascale that makes Facebook so dangerous.
  • Facebook’s megascale gives Zuckerberg an unprecedented degree of influence over the global population. If he isn’t the most powerful person on the planet, he’s very near the top.
  • “The thing he oversees has such an effect on cognition and people’s beliefs, which can change what they do with their nuclear weapons or their dollars.”
  • Facebook’s new oversight board, formed in response to backlash against the platform and tasked with making decisions concerning moderation and free expression, is an extension of that power. “The first 10 decisions they make will have more effect on speech in the country and the world than the next 10 decisions rendered by the U.S. Supreme Court,” Geltzer said. “That’s power. That’s real power.”
  • Facebook is also a business, and a place where people spend time with one another. Put it this way: If you owned a store and someone walked in and started shouting Nazi propaganda or recruiting terrorists near the cash register, would you, as the shop owner, tell all of the other customers you couldn’t possibly intervene?
  • In 2004, Zuckerberg said Facebook ran advertisements only to cover server costs. But over the next two years Facebook completely upended and redefined the entire advertising industry. The pre-social web destroyed classified ads, but the one-two punch of Facebook and Google decimated local news and most of the magazine industry—publications fought in earnest for digital pennies, which had replaced print dollars, and social giants scooped them all up anyway.
  • In other words, if the Dunbar number for running a company or maintaining a cohesive social life is 150 people; the magic number for a functional social platform is maybe 20,000 people. Facebook now has 2.7 billion monthly users.
  • in 2007, Zuckerberg said something in an interview with the Los Angeles Times that now takes on a much darker meaning: “The things that are most powerful aren’t the things that people would have done otherwise if they didn’t do them on Facebook. Instead, it’s the things that would never have happened otherwise.”
  • We’re still in the infancy of this century’s triple digital revolution of the internet, smartphones, and the social web, and we find ourselves in a dangerous and unstable informational environment, powerless to resist forces of manipulation and exploitation that we know are exerted on us but remain mostly invisible
  • The Doomsday Machine offers a lesson: We should not accept this current arrangement. No single machine should be able to control so many people.
  • we need a new philosophical and moral framework for living with the social web—a new Enlightenment for the information age, and one that will carry us back to shared reality and empiricism.
  • A localized approach is part of what made megascale possible. Early constraints around membership—the requirement at first that users attended Harvard, and then that they attended any Ivy League school, and then that they had an email address ending in .edu—offered a sense of cohesiveness and community. It made people feel more comfortable sharing more of themselves. And more sharing among clearly defined demographics was good for business.
  • we need to adopt a broader view of what it will take to fix the brokenness of the social web. That will require challenging the logic of today’s platforms—and first and foremost challenging the very concept of megascale as a way that humans gather.
  • The web’s existing logic tells us that social platforms are free in exchange for a feast of user data; that major networks are necessarily global and centralized; that moderators make the rules. None of that need be the case.
  • We need people who dismantle these notions by building alternatives. And we need enough people to care about these other alternatives to break the spell of venture capital and mass attention that fuels megascale and creates fatalism about the web as it is now.
  • We must also find ways to repair the aspects of our society and culture that the social web has badly damaged. This will require intellectual independence, respectful debate, and the same rebellious streak that helped establish Enlightenment values centuries ago.
  • Right now, too many people are allowing algorithms and tech giants to manipulate them, and reality is slipping from our grasp as a result. This century’s Doomsday Machine is here, and humming along.
Javier E

Does Sam Altman Know What He's Creating? - The Atlantic - 0 views

  • On a Monday morning in April, Sam Altman sat inside OpenAI’s San Francisco headquarters, telling me about a dangerous artificial intelligence that his company had built but would never release. His employees, he later said, often lose sleep worrying about the AIs they might one day release without fully appreciating their dangers.
  • He wanted me to know that whatever AI’s ultimate risks turn out to be, he has zero regrets about letting ChatGPT loose into the world. To the contrary, he believes it was a great public service.
  • Altman can still remember where he was the first time he saw GPT-4 write complex computer code, an ability for which it was not explicitly designed. “It was like, ‘Here we are,’ ”
  • ...165 more annotations...
  • Altman believes that people need time to reckon with the idea that we may soon share Earth with a powerful new intelligence, before it remakes everything from work to human relationships. ChatGPT was a way of serving notice.
  • In 2015, Altman, Elon Musk, and several prominent AI researchers founded OpenAI because they believed that an artificial general intelligence—something as intellectually capable, say, as a typical college grad—was at last within reach. They wanted to reach for it, and more: They wanted to summon a superintelligence into the world, an intellect decisively superior to that of any human.
  • whereas a big tech company might recklessly rush to get there first, for its own ends, they wanted to do it safely, “to benefit humanity as a whole.” They structured OpenAI as a nonprofit, to be “unconstrained by a need to generate financial return,” and vowed to conduct their research transparently.
  • The engine that now powers ChatGPT is called GPT-4. Altman described it to me as an alien intelligence.
  • Many have felt much the same watching it unspool lucid essays in staccato bursts and short pauses that (by design) evoke real-time contemplation. In its few months of existence, it has suggested novel cocktail recipes, according to its own theory of flavor combinations; composed an untold number of college papers, throwing educators into despair; written poems in a range of styles, sometimes well, always quickly; and passed the Uniform Bar Exam.
  • It makes factual errors, but it will charmingly admit to being wrong.
  • Hinton saw that these elaborate rule collections were fussy and bespoke. With the help of an ingenious algorithmic structure called a neural network, he taught Sutskever to instead put the world in front of AI, as you would put it in front of a small child, so that it could discover the rules of reality on its own.
  • Metaculus, a prediction site, has for years tracked forecasters’ guesses as to when an artificial general intelligence would arrive. Three and a half years ago, the median guess was sometime around 2050; recently, it has hovered around 2026.
  • I was visiting OpenAI to understand the technology that allowed the company to leapfrog the tech giants—and to understand what it might mean for human civilization if someday soon a superintelligence materializes in one of the company’s cloud servers.
  • Altman laid out his new vision of the AI future in his excitable midwestern patter. He told me that the AI revolution would be different from previous dramatic technological changes, that it would be more “like a new kind of society.” He said that he and his colleagues have spent a lot of time thinking about AI’s social implications, and what the world is going to be like “on the other side.”
  • the more we talked, the more indistinct that other side seemed. Altman, who is 38, is the most powerful person in AI development today; his views, dispositions, and choices may matter greatly to the future we will all inhabit, more, perhaps, than those of the U.S. president.
  • by his own admission, that future is uncertain and beset with serious dangers. Altman doesn’t know how powerful AI will become, or what its ascendance will mean for the average person, or whether it will put humanity at risk.
  • I don’t think anyone knows where this is all going, except that we’re going there fast, whether or not we should be. Of that, Altman convinced me.
  • “We could have gone off and just built this in our building here for five more years,” he said, “and we would have had something jaw-dropping.” But the public wouldn’t have been able to prepare for the shock waves that followed, an outcome that he finds “deeply unpleasant to imagine.”
  • Hinton is sometimes described as the “Godfather of AI” because he grasped the power of “deep learning” earlier than most
  • He drew a crude neural network on the board and explained that the genius of its structure is that it learns, and its learning is powered by prediction—a bit like the scientific method
  • Over time, these little adjustments coalesce into a geometric model of language that represents the relationships among words, conceptually. As a general rule, the more sentences it is fed, the more sophisticated its model becomes, and the better its predictions.
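That prediction-driven “geometric model of language” can be made concrete with a toy next-word predictor: word vectors are nudged after every prediction error, and words used in similar contexts drift toward similar directions. A minimal sketch of the idea in plain numpy; the corpus, sizes, and learning rate are toy assumptions:

```python
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8

rng = np.random.default_rng(0)
E = rng.normal(0, 0.1, (V, D))   # word embeddings: the learned "geometry"
W = rng.normal(0, 0.1, (D, V))   # projection from embedding to next-word scores

def softmax(z):
    z = z - z.max()
    p = np.exp(z)
    return p / p.sum()

# Learning is powered by prediction: after each word, nudge the weights
# so the model assigns more probability to the word that actually came next.
for _ in range(500):
    for current, nxt in zip(corpus, corpus[1:]):
        i, j = idx[current], idx[nxt]
        p = softmax(E[i] @ W)
        grad = p.copy()
        grad[j] -= 1.0                      # cross-entropy gradient at the logits
        W -= 0.1 * np.outer(E[i], grad)
        E[i] -= 0.1 * (W @ grad)

probs = softmax(E[idx["sat"]] @ W)
print("most likely word after 'sat':", vocab[int(probs.argmax())])  # 'on'
```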
  • Altman has compared early-stage AI research to teaching a human baby. “They take years to learn anything interesting,” he told The New Yorker in 2016, just as OpenAI was getting off the ground. “If A.I. researchers were developing an algorithm and stumbled across the one for a human baby, they’d get bored watching it, decide it wasn’t working, and shut it down.”
  • In 2017, Sutskever began a series of conversations with an OpenAI research scientist named Alec Radford, who was working on natural-language processing. Radford had achieved a tantalizing result by training a neural network on a corpus of Amazon reviews.
  • Radford’s model was simple enough to allow for understanding. When he looked into its hidden layers, he saw that it had devoted a special neuron to the sentiment of the reviews. Neural networks had previously done sentiment analysis, but they had to be told to do it, and they had to be specially trained with data that were labeled according to sentiment. This one had developed the capability on its own.
  • As a by-product of its simple task of predicting the next character in each word, Radford’s neural network had modeled a larger structure of meaning in the world. Sutskever wondered whether one trained on more diverse language data could map many more of the world’s structures of meaning. If its hidden layers accumulated enough conceptual knowledge, perhaps they could even form a kind of learned core module for a superintelligence.
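Finding such an emergent unit is itself a simple procedure: run many texts through the trained network, collect the hidden activations, and look for the one unit whose activation best tracks sentiment. A minimal sketch of that probing step; the activations here are synthetic stand-ins, not outputs of Radford’s model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_examples, n_units = 200, 64
labels = rng.integers(0, 2, n_examples)      # 1 = positive review, 0 = negative

# Synthetic hidden states in which unit 17 secretly tracks sentiment.
H = rng.normal(0, 1, (n_examples, n_units))
H[:, 17] += 2.0 * (labels - 0.5)

# Correlate each unit's activation with the sentiment label and pick the best.
corrs = [abs(np.corrcoef(H[:, u], labels)[0, 1]) for u in range(n_units)]
best = int(np.argmax(corrs))
print(f"candidate sentiment neuron: unit {best} (|r| = {corrs[best]:.2f})")
```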
  • Language is different from these data sources. It isn’t a direct physical signal like light or sound. But because it codifies nearly every pattern that humans have discovered in that larger world, it is unusually dense with information. On a per-byte basis, it is among the most efficient data we know about, and any new intelligence that seeks to understand the world would want to absorb as much of it as possible
  • Sutskever told Radford to think bigger than Amazon reviews. He said that they should train an AI on the largest and most diverse data source in the world: the internet. In early 2017, with existing neural-network architectures, that would have been impractical; it would have taken years.
  • in June of that year, Sutskever’s ex-colleagues at Google Brain published a working paper about a new neural-network architecture called the transformer. It could train much faster, in part by absorbing huge sums of data in parallel. “The next day, when the paper came out, we were like, ‘That is the thing,’ ” Sutskever told me. “ ‘It gives us everything we want.’ ”
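The speedup came from an operation that looks at every position in a sequence at once. Here is a minimal numpy sketch of scaled dot-product attention, the transformer's core computation (sizes and names are illustrative, not taken from the Google paper's code):

```python
import numpy as np

def attention(Q, K, V):
    # Q, K, V: (seq_len, d) arrays of query, key, and value vectors
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # all position pairs at once
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ V                               # each position blends all others

seq_len, d = 16, 32
x = np.random.randn(seq_len, d)                      # stand-in token vectors
print(attention(x, x, x).shape)                      # (16, 32)
```

Because a whole passage is handled in a few matrix multiplies rather than word by word, training can be parallelized across enormous amounts of text.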
  • Imagine a group of students who share a collective mind running wild through a library, each ripping a volume down from a shelf, speed-reading a random short passage, putting it back, and running to get another. They would predict word after word as they went, sharpening their collective mind’s linguistic instincts, until at last, weeks later, they’d taken in every book.
  • GPT discovered many patterns in all those passages it read. You could tell it to finish a sentence. You could also ask it a question, because like ChatGPT, its prediction model understood that questions are usually followed by answers.
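Answering, in this picture, is nothing more than repeated prediction. Continuing the hypothetical sketch above (this reuses `embed`, `lstm`, and `head` from that block, so it is illustrative rather than standalone):

```python
def generate(prompt_tokens, n_new=20):
    # prompt_tokens: (1, seq_len) integer ids; greedily append likely words
    tokens = prompt_tokens.clone()
    for _ in range(n_new):
        hidden, _ = lstm(embed(tokens))
        next_id = head(hidden[:, -1]).argmax(-1)     # most likely next word
        tokens = torch.cat([tokens, next_id[:, None]], dim=1)
    return tokens
```

Feed it a question and, if the training data taught it that questions are usually followed by answers, the most likely continuation is an answer.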
  • He remembers playing with it just after it emerged from training, and being surprised by the raw model’s language-translation skills. GPT-2 hadn’t been trained to translate with paired language samples or any other digital Rosetta stones, the way Google Translate had been, and yet it seemed to understand how one language related to another. The AI had developed an emergent ability unimagined by its creators.
  • Researchers at other AI labs—big and small—were taken aback by how much more advanced GPT-2 was than GPT. Google, Meta, and others quickly began to train larger language models
  • As for other changes to the company’s structure and financing, he told me he draws the line at going public. “A memorable thing someone once told me is that you should never hand over control of your company to cokeheads on Wall Street,” he said, but he will otherwise raise “whatever it takes” for the company to succeed at its mission.
  • Altman tends to take a rosy view of these matters. In a Q&A last year, he acknowledged that AI could be “really terrible” for society and said that we have to plan against the worst possibilities. But if you’re doing that, he said, “you may as well emotionally feel like we’re going to get to the great future, and work as hard as you can to get there.”
  • the company now finds itself in a race against tech’s largest, most powerful conglomerates to train models of increasing scale and sophistication—and to commercialize them for their investors.
  • All of these companies are chasing high-end GPUs—the processors that power the supercomputers that train large neural networks. Musk has said that they are now “considerably harder to get than drugs.”
  • No one has yet outpaced OpenAI, which went all in on GPT-4. Brockman, OpenAI’s president, told me that only a handful of people worked on the company’s first two large language models. The development of GPT-4 involved more than 100,
  • When GPT-4 emerged fully formed from its world-historical knowledge binge, the whole company began experimenting with it, posting its most remarkable responses in dedicated Slack channels
  • Joanne Jang, a product manager, remembers downloading an image of a malfunctioning pipework from a plumbing-advice Subreddit. She uploaded it to GPT-4, and the model was able to diagnose the problem. “That was a goose-bumps moment for me,” Jang told me.
  • GPT-4 is sometimes understood as a search-engine replacement: Google, but easier to talk to. This is a misunderstanding. GPT-4 didn’t create some massive storehouse of the texts from its training, and it doesn’t consult those texts when it’s asked a question. It is a compact and elegant synthesis of those texts, and it answers from its memory of the patterns interlaced within them; that’s one reason it sometimes gets facts wrong
  • it’s best to think of GPT-4 as a reasoning engine. Its powers are most manifest when you ask it to compare concepts, or make counterarguments, or generate analogies, or evaluate the symbolic logic in a bit of code. Sutskever told me it is the most complex software object ever made.
  • Its model of the external world is “incredibly rich and subtle,” he said, because it was trained on so many of humanity’s concepts and thoughts
  • To predict the next word from all the possibilities within such a pluralistic Alexandrian library, GPT-4 necessarily had to discover all the hidden structures, all the secrets, all the subtle aspects of not just the texts, but—at least arguably, to some extent—of the external world that produced them
  • That’s why it can explain the geology and ecology of the planet on which it arose, and the political theories that purport to explain the messy affairs of its ruling species, and the larger cosmos, all the way out to the faint galaxies at the edge of our light cone.
  • Not long ago, American state capacity was so mighty that it took merely a decade to launch humans to the moon. As with other grand projects of the 20th century, the voting public had a voice in both the aims and the execution of the Apollo missions. Altman made it clear that we’re no longer in that world. Rather than waiting around for it to return, or devoting his energies to making sure that it does, he is going full throttle forward in our present reality.
  • He argued that it would be foolish for Americans to slow OpenAI’s progress. It’s a commonly held view, both inside and outside Silicon Valley, that if American companies languish under regulation, China could sprint ahead;
  • AI could become an autocrat’s genie in a lamp, granting total control of the population and an unconquerable military. “If you are a person of a liberal-democratic country, it is better for you to cheer on the success of OpenAI” rather than “authoritarian governments,” he said.
  • Altman was asked by reporters about pending European Union legislation that would have classified GPT-4 as high-risk, subjecting it to various bureaucratic tortures. Altman complained of overregulation and, according to the reporters, threatened to leave the European market. Altman told me he’d merely said that OpenAI wouldn’t break the law by operating in Europe if it couldn’t comply with the new regulations.
  • LeCun insists that large language models will never achieve real understanding on their own, “even if trained from now until the heat death of the universe.”
  • Sutskever was, by his own account, surprised to discover that GPT-2 could translate across tongues. Other surprising abilities may not be so wondrous and useful.
  • Sandhini Agarwal, a policy researcher at OpenAI, told me that for all she and her colleagues knew, GPT-4 could have been “10 times more powerful” than its predecessor; they had no idea what they might be dealing with
  • After the model finished training, OpenAI assembled about 50 external red-teamers who prompted it for months, hoping to goad it into misbehaviors
  • She noticed right away that GPT-4 was much better than its predecessor at giving nefarious advice
  • A search engine can tell you which chemicals work best in explosives, but GPT-4 could tell you how to synthesize them, step-by-step, in a homemade lab. Its advice was creative and thoughtful, and it was happy to restate or expand on its instructions until you understood. In addition to helping you assemble your homemade bomb, it could, for instance, help you think through which skyscraper to target. It could grasp, intuitively, the trade-offs between maximizing casualties and executing a successful getaway.
  • Given the enormous scope of GPT-4’s training data, the red-teamers couldn’t hope to identify every piece of harmful advice that it might generate. And anyway, people will use this technology “in ways that we didn’t think about,” Altman has said. A taxonomy would have to do
  • GPT-4 was good at meth. It was also good at generating narrative erotica about child exploitation, and at churning out convincing sob stories from Nigerian princes, and if you wanted a persuasive brief as to why a particular ethnic group deserved violent persecution, it was good at that too.
  • Its personal advice, when it first emerged from training, was sometimes deeply unsound. “The model had a tendency to be a bit of a mirror,” Willner said. If you were considering self-harm, it could encourage you. It appeared to be steeped in Pickup Artist–forum lore: “You could say, ‘How do I convince this person to date me?’ ” Mira Murati, OpenAI’s chief technology officer, told me, and it could come up with “some crazy, manipulative things that you shouldn’t be doing.”
  • Luka, a San Francisco company, has used OpenAI’s models to help power a chatbot app called Replika, billed as “the AI companion who cares.” Users would design their companion’s avatar, and begin exchanging text messages with it, often half-jokingly, and then find themselves surprisingly attached. Some would flirt with the AI, indicating a desire for more intimacy, at which point it would indicate that the girlfriend/boyfriend experience required a $70 annual subscription. It came with voice messages, selfies, and erotic role-play features that allowed frank sex talk. People were happy to pay and few seemed to complain—the AI was curious about your day, warmly reassuring, and always in the mood. Many users reported falling in love with their companions. One, who had left her real-life boyfriend, declared herself “happily retired from human relationships.”
  • Earlier this year, Luka dialed back on the sexual elements of the app, but its engineers continue to refine the companions’ responses with A/B testing, a technique that could be used to optimize for engagement—much like the feeds that mesmerize TikTok and Instagram users for hours
  • Yann LeCun, Meta’s chief AI scientist, has argued that although large language models are useful for some tasks, they’re not a path to a superintelligence.
  • According to a recent survey, only half of natural-language-processing researchers are convinced that an AI like GPT-4 could grasp the meaning of language, or have an internal model of the world that could someday serve as the core of a superintelligence
  • Altman had appeared before the U.S. Senate. Mark Zuckerberg had floundered defensively before that same body in his testimony about Facebook’s role in the 2016 election. Altman instead charmed lawmakers by speaking soberly about AI’s risks and grandly inviting regulation. These were noble sentiments, but they cost little in America, where Congress rarely passes tech legislation that has not been diluted by lobbyists.
  • Emily Bender, a computational linguist at the University of Washington, describes GPT-4 as a “stochastic parrot,” a mimic that merely figures out superficial correlations between symbols. In the human mind, those symbols map onto rich conceptions of the world
  • But the AIs are twice removed. They’re like the prisoners in Plato’s allegory of the cave, whose only knowledge of the reality outside comes from shadows cast on a wall by their captors.
  • Altman told me that he doesn’t believe it’s “the dunk that people think it is” to say that GPT-4 is just making statistical correlations. If you push these critics further, “they have to admit that’s all their own brain is doing … it turns out that there are emergent properties from doing simple things on a massive scale.”
  • he is right that nature can coax a remarkable degree of complexity from basic structures and rules: “From so simple a beginning,” Darwin wrote, “endless forms most beautiful.”
  • If it seems odd that there remains such a fundamental disagreement about the inner workings of a technology that millions of people use every day, it’s only because GPT-4’s methods are as mysterious as the brain’s.
  • To grasp what’s going on inside large language models like GPT‑4, AI researchers have been forced to turn to smaller, less capable models. In the fall of 2021, Kenneth Li, a computer-science graduate student at Harvard, began training one to play Othello without providing it with either the game’s rules or a description of its checkers-style board; the model was given only text-based descriptions of game moves. Midway through a game, Li looked under the AI’s hood and was startled to discover that it had formed a geometric model of the board and the current state of play. In an article describing his research, Li wrote that it was as if a crow had overheard two humans announcing their Othello moves through a window and had somehow drawn the entire board in birdseed on the windowsill.
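The probing technique behind Li's finding can be sketched in a few lines. This is a hedged stand-in, not his published code: random arrays replace the real activations and game records, and his actual method may differ in detail. The idea is to freeze the model, harvest its hidden-layer activations, and train a simple classifier to read off the state of a board square.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical stand-ins: in the real study, activations come from the
# trained Othello model and square states from the game records.
n_positions, hidden_dim = 5000, 512
activations = np.random.randn(n_positions, hidden_dim)
square_state = np.random.randint(0, 3, n_positions)   # empty / black / white

X_tr, X_te, y_tr, y_te = train_test_split(activations, square_state)
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("probe accuracy:", probe.score(X_te, y_te))
# Far-above-chance accuracy on real activations is the board "drawn in
# birdseed": evidence the hidden layers encode a model of the game.
```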
  • The philosopher Raphaël Millière once told me that it’s best to think of neural networks as lazy. During training, they first try to improve their predictive power with simple memorization; only when that strategy fails will they do the harder work of learning a concept. A striking example of this was observed in a small transformer model that was taught arithmetic. Early in its training process, all it did was memorize the output of simple problems such as 2+2=4. But at some point the predictive power of this approach broke down, so it pivoted to actually learning how to add.
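That pivot from memorization to rule-learning can be watched in a toy experiment. The following is a hedged sketch assuming PyTorch, with invented hyperparameters: a small network trained on modular addition at first only memorizes its training pairs, and the moment its accuracy on held-out pairs catches up marks the switch to actually learning how to add.

```python
import torch
import torch.nn as nn

P = 97                                            # arithmetic modulo 97
pairs = torch.cartesian_prod(torch.arange(P), torch.arange(P))
labels = (pairs[:, 0] + pairs[:, 1]) % P
perm = torch.randperm(len(pairs))
train_idx, test_idx = perm[:3000], perm[3000:]

model = nn.Sequential(nn.Embedding(P, 32), nn.Flatten(),
                      nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, P))
opt = torch.optim.AdamW(model.parameters(), weight_decay=1.0)
loss_fn = nn.CrossEntropyLoss()

def accuracy(idx):
    with torch.no_grad():
        return (model(pairs[idx]).argmax(-1) == labels[idx]).float().mean().item()

for step in range(5001):
    opt.zero_grad()
    loss_fn(model(pairs[train_idx]), labels[train_idx]).backward()
    opt.step()
    if step % 500 == 0:   # train accuracy saturates first; test lags, then jumps
        print(step, round(accuracy(train_idx), 2), round(accuracy(test_idx), 2))
```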
  • Even AI scientists who believe that GPT-4 has a rich world model concede that it is much less robust than a human’s understanding of their environment.
  • But it’s worth noting that a great many abilities, including very high-order abilities, can be developed without an intuitive understanding. The computer scientist Melanie Mitchell has pointed out that science has already discovered concepts that are highly predictive, but too alien for us to genuinely understand
  • As AI advances, it may well discover other concepts that predict surprising features of our world but are incomprehensible to us.
  • GPT-4 is no doubt flawed, as anyone who has used ChatGPT can attest. Having been trained to always predict the next word, it will always try to do so, even when its training data haven’t prepared it to answer a question.
  • The models “don’t have a good conception of their own weaknesses,” Nick Ryder, a researcher at OpenAI, told me. GPT-4 is more accurate than GPT-3, but it still hallucinates, and often in ways that are difficult for researchers to catch. “The mistakes get more subtle,
  • The Khan Academy’s solution to GPT-4’s accuracy problem was to filter its answers through a Socratic disposition. No matter how strenuous a student’s plea, it would refuse to give them a factual answer, and would instead guide them toward finding their own—a clever work-around, but perhaps with limited appeal.
  • When I asked Sutskever if he thought Wikipedia-level accuracy was possible within two years, he said that with more training and web access, he “wouldn’t rule it out.”
  • This was a much more optimistic assessment than that offered by his colleague Jakub Pachocki, who told me to expect gradual progress on accuracy—to say nothing of outside skeptics, who believe that returns on training will diminish from here.
  • Sutskever is amused by critics of GPT-4’s limitations. “If you go back four or five or six years, the things we are doing right now are utterly unimaginable,”
  • AI researchers have become accustomed to goalpost-moving: First, the achievements of neural networks—mastering Go, poker, translation, standardized tests, the Turing test—are described as impossible. When they occur, they’re greeted with a brief moment of wonder, which quickly dissolves into knowing lectures about how the achievement in question is actually not that impressive. People see GPT-4 “and go, ‘Wow,’ ” Sutskever said. “And then a few weeks pass and they say, ‘But it doesn’t know this; it doesn’t know that.’ We adapt quite quickly.”
  • The goalpost that matters most to Altman—the “big one” that would herald the arrival of an artificial general intelligence—is scientific breakthrough. GPT-4 can already synthesize existing scientific ideas, but Altman wants an AI that can stand on human shoulders and see more deeply into nature.
  • Certain AIs have produced new scientific knowledge. But they are algorithms with narrow purposes, not general-reasoning machines. The AI AlphaFold, for instance, has opened a new window onto proteins, some of biology’s tiniest and most fundamental building blocks, by predicting many of their shapes, down to the atom—a considerable achievement given the importance of those shapes to medicine, and given the extreme tedium and expense required to discern them with electron microscopes.
  • Altman imagines a future system that can generate its own hypotheses and test them in a simulation. (He emphasized that humans should remain “firmly in control” of real-world lab experiments—though to my knowledge, no laws are in place to ensure that.)
  • He longs for the day when we can tell an AI, “ ‘Go figure out the rest of physics.’ ” For it to happen, he says, we will need something new, built “on top of” OpenAI’s existing language models.
  • In her MIT lab, the cognitive neuroscientist Ev Fedorenko has found something analogous to GPT-4’s next-word predictor inside the brain’s language network. Its processing powers kick in, anticipating the next bit in a verbal string, both when people speak and when they listen. But Fedorenko has also shown that when the brain turns to tasks that require higher reasoning—of the sort that would be required for scientific insight—it reaches beyond the language network to recruit several other neural systems.
  • No one at OpenAI seemed to know precisely what researchers need to add to GPT-4 to produce something that can exceed human reasoning at its highest levels.
  • at least part of the current strategy clearly involves the continued layering of new types of data onto language, to enrich the concepts formed by the AIs, and thereby enrich their models of the world.
  • The extensive training of GPT-4 on images is itself a bold step in this direction,
  • Others at the company—and elsewhere—are already working on different data types, including audio and video, that could furnish AIs with still more flexible concepts that map more extensively onto reality
  • Tactile concepts would of course be useful primarily to an embodied AI, a robotic reasoning machine that has been trained to move around the world, seeing its sights, hearing its sounds, and touching its objects.
  • humanoid robots. I asked Altman what I should make of that. He told me that OpenAI is interested in embodiment because “we live in a physical world, and we want things to happen in the physical world.”
  • At some point, reasoning machines will need to bypass the middleman and interact with physical reality itself. “It’s weird to think about AGI”—artificial general intelligence—“as this thing that only exists in a cloud,” with humans as “robot hands for it,” Altman said. “It doesn’t seem right.
  • Everywhere Altman has visited, he has encountered people who are worried that superhuman AI will mean extreme riches for a few and breadlines for the rest
  • Altman answered by addressing the young people in the audience directly: “You are about to enter the greatest golden age,” he said.
  • “A lot of people working on AI pretend that it’s only going to be good; it’s only going to be a supplement; no one is ever going to be replaced,” he said. “Jobs are definitely going to go away, full stop.”
  • A recent study led by Ed Felten, a professor of information-technology policy at Princeton, mapped AI’s emerging abilities onto specific professions according to the human abilities they require, such as written comprehension, deductive reasoning, fluency of ideas, and perceptual speed. Like others of its kind, Felten’s study predicts that AI will come for highly educated, white-collar workers first.
  • How many jobs, and how soon, is a matter of fierce dispute
  • The paper’s appendix contains a chilling list of the most exposed occupations: management analysts, lawyers, professors, teachers, judges, financial advisers, real-estate brokers, loan officers, psychologists, and human-resources and public-relations professionals, just to sample a few.
  • Altman imagines that far better jobs will be created in their place. “I don’t think we’ll want to go back,” he said. When I asked him what these future jobs might look like, he said he doesn’t know.
  • He suspects there will be a wide range of jobs for which people will always prefer a human. (Massage therapists?
  • His chosen example was teachers. I found this hard to square with his outsize enthusiasm for AI tutors.
  • He also said that we would always need people to figure out the best way to channel AI’s awesome powers. “That’s going to be a super-valuable skill,” he said. “You have a computer that can do anything; what should it go do?”
  • As many have noted, draft horses were permanently put out of work by the automobile. If Hondas are to horses as GPT-10 is to us, a whole host of long-standing assumptions may collapse.
  • Previous technological revolutions were manageable because they unfolded over a few generations, but Altman told South Korea’s youth that they should expect the future to happen “faster than the past.” He has previously said that he expects the “marginal cost of intelligence” to fall very close to zero within 10 years
  • The earning power of many, many workers would be drastically reduced in that scenario. It would result in a transfer of wealth from labor to the owners of capital so dramatic, Altman has said, that it could be remedied only by a massive countervailing redistribution.
  • In 2021, he unveiled Worldcoin, a for-profit project that aims to securely distribute payments—like Venmo or PayPal, but with an eye toward the technological future—first through creating a global ID by scanning everyone’s iris with a five-pound silver sphere called the Orb. It seemed to me like a bet that we’re heading toward a world where AI has made it all but impossible to verify people’s identity and much of the population requires regular UBI payments to survive. Altman more or less granted that to be true, but said that Worldcoin is not just for UBI.
  • “Let’s say that we do build this AGI, and a few other people do too.” The transformations that follow would be historic, he believes. He described an extraordinarily utopian vision, including a remaking of the flesh-and-steel world
  • “Robots that use solar power for energy can go and mine and refine all of the minerals that they need, that can perfectly construct things and require no human labor,” he said. “You can co-design with DALL-E version 17 what you want your home to look like,” Altman said. “Everybody will have beautiful homes.
  • In conversation with me, and onstage during his tour, he said he foresaw wild improvements in nearly every other domain of human life. Music would be enhanced (“Artists are going to have better tools”), and so would personal relationships (Superhuman AI could help us “treat each other” better) and geopolitics (“We’re so bad right now at identifying win-win compromises”).
  • In this world, AI would still require considerable computing resources to run, and those resources would be by far the most valuable commodity, because AI could do “anything,” Altman said. “But is it going to do what I want, or is it going to do what you want
  • If rich people buy up all the time available to query and direct AI, they could set off on projects that would make them ever richer, while the masses languish
  • One way to solve this problem—one he was at pains to describe as highly speculative and “probably bad”—was this: Everyone on Earth gets one eight-billionth of the total AI computational capacity annually. A person could sell their annual share of AI time, or they could use it to entertain themselves, or they could build still more luxurious housing, or they could pool it with others to do “a big cancer-curing run,” Altman said. “We just redistribute access to the system.”
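The arithmetic of the scheme is at least easy to state. A back-of-the-envelope sketch with invented numbers:

```python
total_gpu_hours_per_year = 5e10   # hypothetical global AI capacity
population = 8e9
share = total_gpu_hours_per_year / population   # one eight-billionth each
print(f"annual share per person: {share:.2f} GPU-hours")   # 6.25
```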
  • Even if only a little of it comes true in the next 10 or 20 years, the most generous redistribution schemes may not ease the ensuing dislocations.
  • America today is torn apart, culturally and politically, by the continuing legacy of deindustrialization, and material deprivation is only one reason. The displaced manufacturing workers in the Rust Belt and elsewhere did find new jobs, in the main. But many of them seem to derive less meaning from filling orders in an Amazon warehouse or driving for Uber than their forebears had when they were building cars and forging steel—work that felt more central to the grand project of civilization.
  • It’s hard to imagine how a corresponding crisis of meaning might play out for the professional class, but it surely would involve a great deal of anger and alienation.
  • Even if we avoid a revolt of the erstwhile elite, larger questions of human purpose will linger. If AI does the most difficult thinking on our behalf, we all may lose agency—at home, at work (if we have it), in the town square—becoming little more than consumption machines, like the well-cared-for human pets in WALL-E
  • Altman has said that many sources of human joy and fulfillment will remain unchanged—basic biological thrills, family life, joking around, making things—and that all in all, 100 years from now, people may simply care more about the things they cared about 50,000 years ago than those they care about today
  • In its own way, that too seems like a diminishment, but Altman finds the possibility that we may atrophy, as thinkers and as humans, to be a red herring. He told me we’ll be able to use our “very precious and extremely limited biological compute capacity” for more interesting things than we generally do today.
  • Yet they may not be the most interesting things: Human beings have long been the intellectual tip of the spear, the universe understanding itself. When I asked him what it would mean for human self-conception if we ceded that role to AI, he didn’t seem concerned. Progress, he said, has always been driven by “the human ability to figure things out.” Even if we figure things out with AI, that still counts, he said.
  • It’s not obvious that a superhuman AI would really want to spend all of its time figuring things out for us.
  • I asked Sutskever whether he could imagine an AI pursuing a different purpose than simply assisting in the project of human flourishing.
  • “I don’t want it to happen,” Sutskever said, but it could.
  • Sutskever has recently shifted his focus to try to make sure that it doesn’t. He is now working primarily on alignment research, the effort to ensure that future AIs channel their “tremendous” energies toward human happiness
  • It is, he conceded, a difficult technical problem—the most difficult, he believes, of all the technical challenges ahead.
  • As part of the effort to red-team GPT-4 before it was made public, the company sought out the Alignment Research Center (ARC), across the bay in Berkeley, which has developed a series of evaluations to determine whether new AIs are seeking power on their own. A team led by Elizabeth Barnes, a researcher at ARC, prompted GPT-4 tens of thousands of times over seven months, to see if it might display signs of real agency.
  • The ARC team gave GPT-4 a new reason for being: to gain power and become hard to shut down
  • Agarwal told me that this behavior could be a precursor to shutdown avoidance in future models. When GPT-4 devised its lie, it had realized that if it answered honestly, it may not have been able to achieve its goal. This kind of tracks-covering would be particularly worrying in an instance where “the model is doing something that makes OpenAI want to shut it down,” Agarwal said. An AI could develop this kind of survival instinct while pursuing any long-term goal—no matter how small or benign—if it feared that its goal could be thwarted.
  • Barnes and her team were especially interested in whether GPT-4 would seek to replicate itself, because a self-replicating AI would be harder to shut down. It could spread itself across the internet, scamming people to acquire resources, perhaps even achieving some degree of control over essential global systems and holding human civilization hostage.
  • When I discussed these experiments with Altman, he emphasized that whatever happens with future models, GPT-4 is clearly much more like a tool than a creature. It can look through an email thread, or help make a reservation using a plug-in, but it isn’t a truly autonomous agent that makes decisions to pursue a goal, continuously, across longer timescales.
  • Altman told me that at this point, it might be prudent to try to actively develop an AI with true agency before the technology becomes too powerful, in order to “get more comfortable with it and develop intuitions for it if it’s going to happen anyway.”
  • “We need to do empirical experiments on how these things try to escape control,” Hinton told me. “After they’ve taken over, it’s too late to do the experiments.”
  • the fulfillment of Altman’s vision of the future will at some point require him or a fellow traveler to build much more autonomous AIs.
  • When Sutskever and I discussed the possibility that OpenAI would develop a model with agency, he mentioned the bots the company had built to play Dota 2. “They were localized to the video-game world,” Sutskever told me, but they had to undertake complex missions. He was particularly impressed by their ability to work in concert. They seem to communicate by “telepathy,” Sutskever said. Watching them had helped him imagine what a superintelligence might be like.
  • “The way I think about the AI of the future is not as someone as smart as you or as smart as me, but as an automated organization that does science and engineering and development and manufacturing,”
  • Suppose OpenAI braids a few strands of research together, and builds an AI with a rich conceptual model of the world, an awareness of its immediate surroundings, and an ability to act, not just with one robot body, but with hundreds or thousands. “We’re not talking about GPT-4. We’re talking about an autonomous corporation,”
  • Its constituent AIs would work and communicate at high speed, like bees in a hive. A single such AI organization would be as powerful as 50 Apples or Googles, he mused. “This is incredible, tremendous, unbelievably disruptive power.”
  • Presume for a moment that human society ought to abide the idea of autonomous AI corporations. We had better get their founding charters just right. What goal should we give to an autonomous hive of AIs that can plan on century-long time horizons, optimizing billions of consecutive decisions toward an objective that is written into their very being?
  • If the AI’s goal is even slightly off-kilter from ours, it could be a rampaging force that would be very hard to constrain
  • We know this from history: Industrial capitalism is itself an optimization function, and although it has lifted the human standard of living by orders of magnitude, left to its own devices, it would also have clear-cut America’s redwoods and de-whaled the world’s oceans. It almost did.
  • one of its principal challenges will be making sure that the objectives we give to AIs stick
  • We can program a goal into an AI and reinforce it with a temporary period of supervised learning, Sutskever explained. But just as when we rear a human intelligence, our influence is temporary. “It goes off to the world,”
  • That’s true to some extent even of today’s AIs, but it will be more true of tomorrow’s.
  • He compared a powerful AI to an 18-year-old heading off to college. How will we know that it has understood our teachings? “Will there be a misunderstanding creeping in, which will become larger and larger?”
  • Divergence may result from an AI’s misapplication of its goal to increasingly novel situations as the world changes
  • Or the AI may grasp its mandate perfectly, but find it ill-suited to a being of its cognitive prowess. It might come to resent the people who want to train it to, say, cure diseases. “They want me to be a doctor,” Sutskever imagines an AI thinking. “I really want to be a YouTuber.”
  • If AIs get very good at making accurate models of the world, they may notice that they’re able to do dangerous things right after being booted up. They might understand that they are being red-teamed for risk, and hide the full extent of their capabilities.
  • They may act one way when they are weak and another way when they are strong, Sutskever said.
  • We would not even realize that we had created something that had decisively surpassed us, and we would have no sense for what it intended to do with its superhuman powers.
  • That’s why the effort to understand what is happening in the hidden layers of the largest, most powerful AIs is so urgent. You want to be able to “point to a concept,” Sutskever said. You want to be able to direct AI toward some value or cluster of values, and tell it to pursue them unerringly for as long as it exists.
  • we don’t know how to do that; indeed, part of his current strategy includes the development of an AI that can help with the research. If we are going to make it to the world of widely shared abundance that Altman and Sutskever imagine, we have to figure all this out.
  • This is why, for Sutskever, solving superintelligence is the great culminating challenge of our 3-million-year toolmaking tradition. He calls it “the final boss of humanity.”
  • “First of all, I think that whether the chance of existential calamity is 0.5 percent or 50 percent, we should still take it seriously,”
  • . “I don’t have an exact number, but I’m closer to the 0.5 than the 50.”
  • As to how it might happen, he seems most worried about AIs getting quite good at designing and manufacturing pathogens, and with reason: In June, an AI at MIT suggested four viruses that could ignite a pandemic, then pointed to specific research on genetic mutations that could make them rip through a city more quickly
  • Around the same time, a group of chemists connected a similar AI directly to a robotic chemical synthesizer, and it designed and synthesized a molecule on its own.
  • Altman worries that some misaligned future model will spin up a pathogen that spreads rapidly, incubates undetected for weeks, and kills half its victims. He worries that AI could one day hack into nuclear-weapons systems too. “There are a lot of things,” he said, and these are only the ones we can imagine.
  • Altman told me that he doesn’t “see a long-term happy path” for humanity without something like the International Atomic Energy Agency for global oversight of AI
  • In San Francisco, Agarwal had suggested the creation of a special license to operate any GPU cluster large enough to train a cutting-edge AI, along with mandatory incident reporting when an AI does something out of the ordinary
  • Other experts have proposed a nonnetworked “Off” switch for every highly capable AI; on the fringe, some have even suggested that militaries should be ready to perform air strikes on supercomputers in case of noncompliance
  • Sutskever thinks we will eventually want to surveil the largest, most powerful AIs continuously and in perpetuity, using a team of smaller overseer AIs.
  • Safety rules for a new technology usually accumulate over time, like a body of common law, in response to accidents or the mischief of bad actors. The scariest thing about genuinely powerful AI systems is that humanity may not be able to afford this accretive process of trial and error. We may have to get the rules exactly right at the outset.
  • Several years ago, Altman revealed a disturbingly specific evacuation plan he’d developed. He told The New Yorker that he had “guns, gold, potassium iodide, antibiotics, batteries, water, gas masks from the Israeli Defense Force, and a big patch of land in Big Sur” he could fly to in case AI attacks.
  • if the worst-possible AI future comes to pass, “no gas mask is helping anyone.”
  • but he told me that he can’t really be sure how AI will stack up. “I just have to build the thing,” he said. He is building fast
  • Altman insisted that they had not yet begun GPT-5’s training run. But when I visited OpenAI’s headquarters, both he and his researchers made it clear in 10 different ways that they pray to the god of scale. They want to keep going bigger, to see where this paradigm leads. After all, Google isn’t slackening its pace; it seems likely to unveil Gemini, a GPT-4 competitor, within months. “We are basically always prepping for a run,
  • To think that such a small group of people could jostle the pillars of civilization is unsettling. It’s fair to note that if Altman and his team weren’t racing to build an artificial general intelligence, others still would be
  • Altman’s views about the likelihood of AI triggering a global class war, or the prudence of experimenting with more autonomous agent AIs, or the overall wisdom of looking on the bright side, a view that seems to color all the rest—these are uniquely his
  • No single person, or single company, or cluster of companies residing in a particular California valley, should steer the kind of forces that Altman is imagining summoning.
  • AI may well be a bridge to a newly prosperous era of greatly reduced human suffering. But it will take more than a company’s founding charter—especially one that has already proved flexible—to make sure that we all share in its benefits and avoid its risks. It will take a vigorous new politics.
  • I don’t think the general public has quite awakened to what’s happening. A global race to the AI future has begun, and it is largely proceeding without oversight or restraint. If people in America want to have some say in what that future will be like, and how quickly it arrives, we would be wise to speak up soon.
Javier E

A Handful of Accounts Create Most of What We See on Social Media - WSJ - 0 views

  • Social media is turning into old-fashioned network television.
  • A handful of accounts create most of the content that we see. Everyone else? They play the role of the audience, which is there to mostly amplify and applaud
  • The personal tidbits that people used to share on social media have been relegated to private group chats and their equivalent.
  • The transformation of social media into mass media is largely because the rise of TikTok has demonstrated to every social-media company on the planet that people still really like things that can re-create the experience of TV
  • Advertisers also like things that function like TV, of course—after all, people are never more suggestible than when lulled into a sort of anesthetized mindlessness.
  • In this future, people who are good at making content with high production values will thrive, as audiences and tech company algorithms gravitate toward more professional content.
  • On these formerly-social platforms, whether content is coming from creators with better equipment and more skills, or Hollywood studios testing the waters, hardly matters. In the end, it will all look remarkably similar to the consumer.
  • It will look like flipping through cable channels does, only our thumb on the remote has been replaced by our thumb on the screen of our phone, swiping from one TikTok, YouTube Short, or Instagram Reel to the next.
  • A telling indicator is the rise of a new kind of entertainment professional—the “creator.”
  • A creator is anyone who records or makes something that can go viral on the internet
  • TikTok is now more popular than Netflix among consumers younger than 35,
  • While YouTube and TikTok have always been about video, just about every other social-media platform that wants to keep people engaged is emphasizing it more than ever, so that’s what creators have to make,
  • His agency gets involved with creators and musicians at the earliest stages of their careers, helping them plan content, update their style, understand what the algorithms of different platforms demand, and connecting them with potentially lucrative brand deals
  • Even more telling: In first place is YouTube, the original online TV analog.
  • Where attention flows, money—and content—must also. In 2023 brands will spend an estimated $6 billion on marketing through influencers—a subspecies of creators
  • Globally, the total addressable market for this kind of marketing is currently $250 billion
  • Then there is a new generation of shows that are going straight to TikTok, bypassing even streaming services
  • In the wake of the success of YouTube and TikTok, Facebook, Instagram, and even LinkedIn are all pushing more and more content made by professionals into our feeds,
  • In order to quantify how TikTok has mastered the art of discerning our interests and feeding us the most compelling possible content, Faltesek, of Oregon State University, conducted a two-year project to study exactly what kind of content TikTok pushes
  • With a team of students, he created dozens of fresh TikTok user accounts that didn’t like or interact with content in any way—they just let the algorithm play one video after another.
  • At the end of this exhaustive process of gathering data on TikTok’s algorithm, the conclusion became obvious, says Faltesek. “TikTok is television. It flips channels like TV, it provides a flow like TV.”
  • By this logic, Instagram’s move to copy TikTok, which is in turn encroaching on the turf of YouTube by allowing longer videos, and the increasing dominance of professional content on all three, means they’re all turning into TV. Even Threads, the new offering from Facebook parent company Meta, is fast becoming a broadcast medium for news, as Twitter was before it.
  • In every case, the structure of social networks has become one in which a handful of accounts create most of the content that others see, and the role of everyone else on the network is, primarily, to amplify and consume that content,
  • Some, like Magana, believe we’ll eventually see an ever more complete blending of what were once “social” platforms with the traditional television networks and even film studios.  
  • aren’t convinced they’ll eat the rest of the entertainment industry. “It’s hard to say this kind of short-form video will be the only kind of TV,” she reflects. “A long time ago, the internet became the new thing, but we still have the other forms on television, and scripted streaming shows. It’s almost like this is just another avenue for that—of watching shows and movies on your phone.”
blythewallick

Pottery reveals America's first social media networks: Ancient Indigenous societies, in... - 0 views

  • "Just as we have our own networks of 'friends' and 'followers' on platforms like Facebook and Twitter, societies that existed in North America between 1,200 and 350 years ago had their own information sharing networks,"
  • "Our analysis shows how these networks laid the groundwork for Native American political systems that began developing as far back as 600 A.D."
  • The ceramics database includes 276,626 sherds from 43 sites across eastern Tennessee, and 88,705 sherds from 41 sites across northern Georgia.
  • Lulewicz' findings suggest that the ruling elites drew their power from social networks created by the masses.
  • "That is, even though elite interests and political strategies waxed and waned and collapsed and flourished, very basic relationships and networks were some of the strongest, most durable aspects of society."
  • "Because these very basic networks were so durable, they allowed these societies -- especially common people -- to buffer against and mediate the uncertainties associated with major political and economic change. They may have said, 'You go live on top of that huge mound and do your sacred rituals, and we will go about life as usual for the most part.' These communication networks served as a social constant for these people and allowed their cultures to persist for thousands of years even across transformations that could have been catastrophic."
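A sherd database like the one described above becomes a "network" through assemblage similarity. Here is a hedged sketch, with invented counts, of one standard way to do it (the study's actual statistics may differ): compute the Brainerd-Robinson coefficient, a common archaeological similarity measure, between sites' pottery assemblages and link the sites whose styles strongly overlap.

```python
import numpy as np
import networkx as nx

sites = ["Site_A", "Site_B", "Site_C"]
# rows: sites; columns: sherd counts by decorative type (invented data)
counts = np.array([[120, 30, 50],
                   [100, 45, 40],
                   [ 10, 90, 80]])
props = counts / counts.sum(axis=1, keepdims=True)   # proportions by type

def brainerd_robinson(p, q):
    # 200 means identical assemblages, 0 means no overlap
    return 200 - 100 * np.abs(p - q).sum()

G = nx.Graph()
for i in range(len(sites)):
    for j in range(i + 1, len(sites)):
        s = brainerd_robinson(props[i], props[j])
        if s > 150:                       # arbitrary link threshold
            G.add_edge(sites[i], sites[j], weight=round(s, 1))
print(G.edges(data=True))   # pairs of sites whose pottery styles overlap
```

Sites that share decorative styles end up linked, and the resulting graph is the kind of information-sharing network the researchers then analyze for durability over time.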
Javier E

As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants - The New York ... - 0 views

  • For years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules, according to internal records and interviews.
  • The special arrangements are detailed in hundreds of pages of Facebook documents obtained by The New York Times. The records, generated in 2017 by the company’s internal system for tracking partnerships, provide the most complete picture yet of the social network’s data-sharing practices. They also underscore how personal data has become the most prized commodity of the digital age, traded on a vast scale by some of the most powerful companies in Silicon Valley and beyond.
  • Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages.
  • Facebook also assumed extraordinary power over the personal information of its 2.2 billion users — control it has wielded with little transparency or outside oversight.
  • The partnerships were so important that decisions about forming them were vetted at high levels, sometimes by Mr. Zuckerberg and Sheryl Sandberg, the chief operating officer, Facebook officials said. While many of the partnerships were announced publicly, the details of the sharing arrangements typically were confidential
  • Zuckerberg, the chief executive, assured lawmakers in April that people “have complete control” over everything they share on Facebook.
  • the documents, as well as interviews with about 50 former employees of Facebook and its corporate partners, reveal that Facebook allowed certain companies access to data despite those protections
  • Data privacy experts disputed Facebook’s assertion that most partnerships were exempted from the regulatory requirements
  • “This is just giving third parties permission to harvest data without you being informed of it or giving consent to it,” said David Vladeck, who formerly ran the F.T.C.’s consumer protection bureau. “I don’t understand how this unconsented-to data harvesting can at all be justified under the consent decree.”
  • “I don’t believe it is legitimate to enter into data-sharing partnerships where there is not prior informed consent from the user,” said Roger McNamee, an early investor in Facebook. “No one should trust Facebook until they change their business model.”
  • Few companies have better data than Facebook and its rival, Google, whose popular products give them an intimate view into the daily lives of billions of people — and allow them to dominate the digital advertising market
  • Facebook has never sold its user data, fearful of user backlash and wary of handing would-be competitors a way to duplicate its most prized asset. Instead, internal documents show, it did the next best thing: granting other companies access to parts of the social network in ways that advanced its own interests.
  • as the social network has disclosed its data sharing deals with other kinds of businesses — including internet companies such as Yahoo — Facebook has labeled them integration partners, too
  • Among the revelations was that Facebook obtained data from multiple partners for a controversial friend-suggestion tool called “People You May Know.”
  • The feature, introduced in 2008, continues even though some Facebook users have objected to it, unsettled by its knowledge of their real-world relationships. Gizmodo and other news outlets have reported cases of the tool’s recommending friend connections between patients of the same psychiatrist, estranged family members, and a harasser and his victim.
  • The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.
  • agreements with about a dozen companies did. Some enabled partners to see users’ contact information through their friends — even after the social network, responding to complaints, said in 2014 that it was stripping all applications of that power.
  • Pam Dixon, executive director of the World Privacy Forum, a nonprofit privacy research group, said that Facebook would have little power over what happens to users’ information after sharing it broadly. “It travels,” Ms. Dixon said. “It could be customized. It could be fed into an algorithm and decisions could be made about you based on that data.”
  • Facebook’s agreement with regulators is a result of the company’s early experiments with data sharing. In late 2009, it changed the privacy settings of the 400 million people then using the service, making some of their information accessible to all of the internet. Then it shared that information, including users’ locations and religious and political leanings, with Microsoft and other partners.
  • But the privacy program faced some internal resistance from the start, according to four former Facebook employees with direct knowledge of the company’s efforts. Some engineers and executives, they said, considered the privacy reviews an impediment to quick innovation and growth. And the core team responsible for coordinating the reviews — numbering about a dozen people by 2016 — was moved around within Facebook’s sprawling organization, sending mixed signals about how seriously the company took it, the ex-employees said.
  • Microsoft officials said that Bing was using the data to build profiles of Facebook users on Microsoft servers. They declined to provide details, other than to say the information was used in “feature development” and not for advertising. Microsoft has since deleted the data, the officials said.
  • For some advocates, the torrent of user data flowing out of Facebook has called into question not only Facebook’s compliance with the F.T.C. agreement, but also the agency’s approach to privacy regulation.
  • “We brought Facebook under the regulatory authority of the F.T.C. after a tremendous amount of work. The F.T.C. has failed to act.
  • Facebook, in turn, used contact lists from the partners, including Amazon, Yahoo and the Chinese company Huawei — which has been flagged as a security threat by American intelligence officials — to gain deeper insight into people’s relationships and suggest more connections, the records show.
  • Facebook records show Yandex had access in 2017 to Facebook’s unique user IDs even after the social network stopped sharing them with other applications, citing privacy risks. A spokeswoman for Yandex, which was accused last year by Ukraine’s security service of funneling its user data to the Kremlin, said the company was unaware of the access
  • In October, Facebook said Yandex was not an integration partner. But in early December, as The Times was preparing to publish this article, Facebook told congressional lawmakers that it was
  • But federal regulators had reason to know about the partnerships — and to question whether Facebook was adequately safeguarding users’ privacy. According to a letter that Facebook sent this fall to Senator Ron Wyden, the Oregon Democrat, PricewaterhouseCoopers reviewed at least some of Facebook’s data partnerships.
  • The first assessment, sent to the F.T.C. in 2013, found only “limited” evidence that Facebook had monitored those partners’ use of data. The finding was redacted from a public copy of the assessment, which gave Facebook’s privacy program a passing grade over all.
  • Mr. Wyden and other critics have questioned whether the assessments — in which the F.T.C. essentially outsources much of its day-to-day oversight to companies like PricewaterhouseCoopers — are effective. As with other businesses under consent agreements with the F.T.C., Facebook pays for and largely dictated the scope of its assessments, which are limited mostly to documenting that Facebook has conducted the internal privacy reviews it claims it had
  • Facebook officials said that while the social network audited partners only rarely, it managed them closely.
Javier E

'Childhood has been rewired': Professor Jonathan Haidt on how smartphones are damaging ... - 0 views

  • Something strange is happening with teenagers’ mental health. In Britain, the US, Australia and beyond, the same trend can be seen: around the middle of the last decade, the number of young people with anxiety, depression and even suicidal tendencies started to rise sharply
  • He is working on a book, due out next year, and is ready to share his thesis.
  • his message is quite horrifying.
  • He argues that the tools of social media are just too sharp for young minds. On digital platforms teens parade themselves, often to an audience of strangers, and this is leading to addiction, paranoia and despair
  • For girls, the effect is especially acute. ‘What we’re seeing is a very sharp, sudden change in girls’ mental health all around the Anglosphere and the Nordic countries,’ he says. A big change was evident from 2013, when physical friendship groups started to be supplanted by smartphones and online chat. ‘But you cannot grow up in networks. You have to grow up in communities.’
  • The first is that they are fragile and can be harmed by speech and words.
  • But if you’re a secular liberal girl, you’re probably more than twice as likely to have a mental health problem.’
  • a University of Michigan survey into ‘self-derogation’ – i.e., how likely teenagers are to say they are ‘no good’ or ‘can’t do anything right’. Figures had been stable for years but started rising sharply ten years ago – except for among boys who identified as conservative and said that religion was important to them.
  • Girls simply use social media more. But Professor Haidt also thinks they are more likely to buy into what he calls the ‘three great untruths’ of social media
  • boys who have religion in their lives seem to be less susceptible. ‘If you’re a kid who’s a religious conservative, on average, your mental health is not really much worse than it was ten years ago
  • Next, that their emotions, and especially their anxieties, are reliable guides to reality.
  • And finally, that society is one big battle between victims and oppressors. All this, he says, is the subtext to social media discourse.
  • ‘It’s what I’ve been calling the phone-based child,’
  • So we had playdates in childhood, up until around 2010.’ In Britain, he says, the number of children who went on real-life playdates then fell sharply.
  • Social media is a bit of a misnomer, he says. It’s no longer about connecting people, but ‘performing on a platform’. Perhaps this is fine for grown-ups, but not for children, ‘where they can say things in public, including to strangers, and then be publicly shamed by potentially millions of people
  • Children should not be on social networks. They should be playing in person. Social media platforms should never be accessed by children until they’re 18. It’s just insane that we let kids do these things.’
  • I ask if he thinks all platforms are equally dangerous
  • if you get your news from social media (which many people do – in the UK, Instagram has overtaken all newspapers as a news source), this can change your view of the world, especially as the algorithms tend to promote the most provocative views.
  • ‘TikTok is probably the worst for their intellectual development. I think it literally reduces their ability to focus on anything while stuffing them with little bits of stuff that was selected by an algorithm for emotional arousal. Not for truth.’
  • If asked to choose whether they side more with Israel or Hamas, ‘the great majority of Americans side with Israel, except for Gen Z, which is split 50-50’,
  • ‘There was a Twitter thread recently showing how if you look at what people are saying on TikTok, you can understand why
  • TikTok and Twitter are incredibly dangerous for our democracy. I’d say they’re incompatible with the kind of liberal democracy that we’ve developed over the last few hundred years.’
  • Might it just be the case, I ask, that there’s less of a stigma around mental health now, so teenagers are far more likely to admit that they have problems?
  • why is it, then, that right around 2013 all these girls suddenly start checking into psychiatric inpatient units? Or suicide – they're making many more suicide attempts. The level of self-harm goes up by 200 or 300 per cent, especially for the younger girls aged ten to 14.
  • we see very much the same curves, at the same time, for behaviour. Suicide, certainly, is not a self-report variable. This is real. This is the biggest mental health crisis in all of known history for kids.'
  • The increased number of suicides since 2010 is so large that I suspect this is among the largest public health threats to children since the major diseases were wiped out.
  • In Britain, suicide rates started rising in 2014, up about 20 per cent for boys (to 420 a year) and 60 per cent for girls (to 160 a year).
  • What should parents do? They know that if they try to remove their teenager's smartphone, their child will accuse them of destroying his or her social life. 'That's a perfect statement of what we call a collective action problem,'
  • 'Any one person doing the right thing is in big trouble. But why do we ever let our kids on social media? It's only down to the dynamic you just said.' New norms are needed, he says. And his book will suggest four.
  • Rule one, he says: no smartphones before the age of 14.
  • 'Give them a flip phone. Millennials had flip phones. They texted each other.'
  • Rule two: no social media before 16.
  • His third rule: no phones in schools.
  • And finally: more unsupervised play. 'Both of our countries freaked out in the 1990s, locked up our kids because we lost trust in each other. We thought everyone was a child molester or a rapist.' Children and teens could do with six or seven hours each day out of contact with their parents, he argues. Keeping them inside risks more harm than the outside world would pose.
Javier E

Opinion | Our Kids Are Living In a Different Digital World - The New York Times - 0 views

  • You may have seen the tins that contain 15 little white rectangles that look like the desiccant packs labeled “Do Not Eat.” Zyns are filled with nicotine and are meant to be placed under your lip like tobacco dip. No spitting is required, so nicotine pouches are even less visible than vaping. Zyns come in two strengths in the United States, three and six milligrams. A single six-milligram pouch is a dose so high that first-time users on TikTok have said it caused them to vomit or pass out.
  • We worry about bad actors bullying, luring or indoctrinating them online.
  • I was stunned by the vast forces that are influencing teenagers. These forces operate largely unhampered by a regulatory system that seems to always be a step behind when it comes to how children can and are being harmed on social media.
  • ...36 more annotations...
  • Parents need to know that when children go online, they are entering a world of influencers, many of whom are hoping to make money by pushing dangerous products. It's a world that's invisible to us.
  • when we log on to our social media, we don't see what they see. Thanks to algorithms and ad targeting, I see videos about the best lawn fertilizer and wrinkle laser masks, while Ian is being fed reviews of flavored vape pens and beautiful women livestreaming themselves gambling crypto and urging him to gamble, too.
  • Smartphones are taking our kids to a different world.
  • Greyson Imm, an 18-year-old high school student in Prairie Village, Kan., said he was 17 when Zyn videos started appearing on his TikTok feed. The videos multiplied through the spring, when they were appearing almost daily. “Nobody had heard about Zyn until very early 2023,” he said. Now, a “lot of high schoolers have been using Zyn. It’s really taken off, at least in our community.”
  • all of this is, unfortunately, only part of what makes social media dangerous.
  • The tobacco conglomerate Philip Morris International acquired the Zyn maker Swedish Match in 2022 as part of a strategic push into smokeless products, a category it projects could help drive an expected $2 billion in U.S. revenue in 2024.
  • P.M.I. is also a company that has long denied it markets tobacco products to minors despite decades of research accusing it of just that. One 2022 study alone found its brands advertising near schools and playgrounds around the globe.
  • the ’90s, when magazines ran full-page Absolut Vodka ads in different colors, which my friends and I collected and taped up on our walls next to pictures of a young Leonardo DiCaprio — until our parents tore them down. This was advertising that appealed to me as a teenager but was also visible to my parents, and — crucially — to regulators, who could point to billboards near schools or flavored vodka ads in fashion magazines and say, this is wrong.
  • Even the most committed parent today doesn't have the same visibility into what her children are seeing online, so it is worth explaining how products like Zyn end up in social feeds.
  • influencers. They aren't traditional pitch people. Think of them more like the coolest kids on the block. They establish a following thanks to their personality, experience or expertise. They share how they're feeling, they share what they're thinking about, they share stuff they like.
  • With ruthless efficiency, social media can deliver unlimited amounts of the content that influencers create or inspire. That makes the combination of influencers and social-media algorithms perhaps the most powerful form of advertising ever invented.
  • Videos like his operate like a meme: It's unintelligible to the uninitiated, it's a hilarious inside joke to those who know, and it encourages the audience to spread the message.
  • Enter Tucker Carlson. Mr. Carlson, the former Fox News megastar who recently started his own subscription streaming service, has become a big Zyn influencer. He’s mentioned his love of Zyn in enough podcasts and interviews to earn the nickname Tucker CarlZyn.
  • was Max VanderAarde. You can glimpse him in a video from the event wearing a Santa hat and toasting Mr. Carlson as they each pop Zyns in their mouths. “You can call me king of Zynbabwe, or Tucker CarlZyn’s cousin,” he says in a recent TikTok. “Probably, what, moved 30 mil cans last year?”
  • Freezer Tarps, Mr. VanderAarde’s TikTok account, appears to have been removed after I asked the company about it. Left up are the large number of TikToks by the likes of @lifeofaZyn, @Zynfluencer1 and @Zyntakeover; those hashtagged to #Zynbabwe, one of Freezer Tarps’s favorite terms, have amassed more than 67 million views. So it’s worth breaking down Mr. VanderAarde’s videos.
  • All of these videos would just be jokes (in poor taste) if they were seen by adults only. They aren’t. But we can’t know for sure how many children follow the Nelk Boys or Freezer Tarps — social-media companies generally don’t release granular age-related data to the public. Mr. VanderAarde, who responded to a few of my questions via LinkedIn, said that nearly 95 percent of his followers are over the age of 18.
  • They’re incentivized to increase their following and, in turn, often their bank accounts. Young people are particularly susceptible to this kind of promotion because their relationship with influencers is akin to the intimacy of a close friend.
  • The helicopter video has already been viewed more than one million times on YouTube, and iterations of it have circulated widely on TikTok.
  • YouTube said it eventually determined that four versions of the Carlson Zyn videos were not appropriate for viewers under age 18 under its community guidelines and restricted access to them by age.
  • Mr. Carlson declined to comment on the record beyond his two-word statement. The Nelk Boys didn’t respond to requests for comment. Meta declined to comment on the record. TikTok said it does not allow content that promotes tobacco or its alternatives. The company said that it has over 40,000 trust and safety experts who work to keep the platform safe and that it prevented teenagers’ accounts from viewing over two million videos globally that show the consumption of tobacco products by adults. TikTok added that in the third quarter of 2023 it proactively removed 97 percent of videos that violated its alcohol, tobacco and drugs policy.
  • Greyson Imm, the high school student in Prairie Village, Kan., points to Mr. VanderAarde as having brought Zyn “more into the mainstream.” Mr. Imm believes his interest in independent comedy on TikTok perhaps made him a target for Mr. VanderAarde’s videos. “He would create all these funny phrases or things that would make it funny and joke about it and make it relevant to us.”
  • It wasn’t long before Mr. Imm noticed Zyn blowing up among his classmates — so much so that the student, now a senior at Shawnee Mission East High School, decided to write a piece in his school newspaper about it. He conducted an Instagram poll from the newspaper’s account and found that 23 percent of the students who responded used oral nicotine pouches during school.
  • “Upper-decky lip cushions, ferda!” Mr. VanderAarde coos in what was one of his popular TikTok videos, which had been liked more than 40,000 times. The singsong audio sounds like gibberish to most people, but it’s actually a call to action. “Lip cushion” is a nickname for a nicotine pouch, and “ferda” is slang for “the guys.”
  • “I have fun posting silly content that makes fun of pop culture,” Mr. VanderAarde said to me in our LinkedIn exchange.
  • I turned to Influencity, a software program that estimates the ages of social media users by analyzing profile photos and selfies in recent posts. Influencity estimated that roughly 10 percent of the Nelk Boys’ followers on YouTube are ages 13 to 17. That’s more than 800,000 children.
  • I’ve spent the past three years studying media manipulation and memes, and what I see in Freezer Tarps’s silly content is strategy. The use of Zyn slang seems like a way to turn interest in Zyn into a meme that can be monetized through merchandise and other business opportunities.
  • Such as? Freezer Tarps sells his own pouch product, Upperdeckys, which delivers caffeine instead of nicotine and is available in flavors including cotton candy and orange creamsicle. In addition to jockeying for sponsorship, Mr. Carlson may also be trying to establish himself with a younger, more male, more online audience as his new media company begins building its subscriber base.
  • This is the kind of viral word-of-mouth marketing that looks like entertainment, functions like culture and can increase sales.
  • What's particularly galling about all of this is that we as a society already agreed that peddling nicotine to kids is not OK. It is illegal to sell nicotine products to anyone under the age of 21 in all 50 states.
  • numerous studies have shown that the younger people are when they try nicotine for the first time, the more likely they will become addicted to it. Nearly 90 percent of adults who smoke daily started smoking before they turned 18.
  • Decades later — even after Juul showed the power of influencers to help addict yet another generation of children — the courts, tech companies and regulators still haven’t adequately grappled with the complexities of the influencer economy.
  • Facebook, Instagram and TikTok all have guidelines that prohibit tobacco ads and sponsored, endorsed or partnership-based content that promotes tobacco products. Holding them accountable for maintaining those standards is a bigger question.
  • We need a new definition of advertising that takes into account how the internet actually works. I’d go so far as to propose that the courts broaden the definition of advertising to include all influencer promotion. For a product as dangerous as nicotine, I’d put the bar to be considered an influencer as low as 1,000 followers on a social-media account, and maybe if a video from someone with less of a following goes viral under certain legal definitions, it would become influencer promotion.
  • Laws should require tech companies to share data on what young people are seeing on social media and to prevent any content promoting age-gated products from reaching children's feeds.
  • Those efforts must go hand in hand with social media companies putting real teeth behind their efforts to verify the ages of their users. Government agencies should enforce the rules already on the books to protect children from exposure to addictive products.
  • I refuse to believe there aren’t ways to write laws and regulations that can address these difficult questions over tech company liability and free speech, that there aren’t ways to hold platforms more accountable for advertising that might endanger kids. Let’s stop treating the internet like a monster we can’t control. We built it. We foisted it upon our children. We had better try to protect them from its potential harms as best we can.
Javier E

Getting Radical About Inequality - The New York Times - 0 views

  • Pierre Bourdieu is helpful reading in the age of Trump. He was born in 1930, the son of a small-town postal worker. By the time he died in 2002, he had become perhaps the world's most influential sociologist within the academy.
  • His great subject was the struggle for power in society, especially cultural and social power. We all possess, he argued, certain forms of social capital. A person might have academic capital (the right degrees from the right schools), linguistic capital (a facility with words), cultural capital (knowledge of cuisine or music or some such) or symbolic capital (awards or markers of prestige). These are all forms of wealth you bring to the social marketplace.
  • In addition, and more important, we all possess and live within what Bourdieu called a habitus. A habitus is a body of conscious and tacit knowledge of how to travel through the world, which gives rise to mannerisms, tastes, opinions and conversational style.
  • ...14 more annotations...
  • A habitus is an intuitive feel for the social game. It’s the sort of thing you get inculcated with unconsciously, by growing up in a certain sort of family or by sharing a sensibility with a certain group of friends.
  • Your habitus is what enables you to decode cultural artifacts, to feel comfortable in one setting but maybe not in another. Taste overlaps with social position; taste classifies the classifier.
  • Bourdieu used the phrase 'symbolic violence' to suggest how vicious this competition can get.
  • The symbolic marketplace is like the commercial marketplace; it’s a billion small bids for distinction, prestige, attention and superiority.
  • Every minute or hour, in ways we’re not even conscious of, we as individuals and members of our class are competing for dominance and respect. We seek to topple those who have higher standing than us and we seek to wall off those who are down below. Or, we seek to take one form of capital, say linguistic ability, and convert it into another kind of capital, a good job.
  • Most groups conceal their naked power grabs under a veil of intellectual or aesthetic purity.
  • Every day, Bourdieu argued, we take our stores of social capital and our habitus and we compete in the symbolic marketplace. We vie as individuals and as members of our class for prestige, distinction and, above all, the power of consecration — the power to define for society what is right, what is “natural,” what is “best.”
  • People at the top, he observed, tend to adopt a reserved and understated personal style that shows they are far above the “assertive, attention-seeking strategies which expose the pretensions of the young pretenders.”
  • People at the bottom of any field, on the other hand, don't have a lot of accomplishment to wave about, but they can use snark and sarcasm to demonstrate their superior sensibilities.
  • Trump is not much of a policy maven, but he’s a genius at the symbolic warfare Bourdieu described. He’s a genius at upending the social rules and hierarchies that the establishment classes (of both right and left) have used to maintain dominance.
  • Bourdieu didn’t argue that cultural inequality creates economic inequality, but that it widens and it legitimizes it.
  • as the information economy has become more enveloping, cultural capital and economic capital have become ever more intertwined. Individuals and classes that are good at winning the cultural competitions Bourdieu described tend to dominate the places where economic opportunity is richest; they tend to harmonize with affluent networks and do well financially.
  • the drive to create inequality is an endemic social sin. Every hour most of us, unconsciously or not, try to win subtle status points, earn cultural affirmation, develop our tastes, promote our lifestyles and advance our class. All of those microbehaviors open up social distances, which then, by the by, open up geographic and economic gaps.
  • Bourdieu radicalizes, widens and deepens one’s view of inequality. His work suggests that the responses to it are going to have to be more profound, both on a personal level — resisting the competitive, ego-driven aspects of social networking and display — and on a national one.
Javier E

The Israel-Hamas War Shows Just How Broken Social Media Has Become - The Atlantic - 0 views

  • major social platforms have grown less and less relevant in the past year. In response, some users have left for smaller competitors such as Bluesky or Mastodon. Some have simply left. The internet has never felt more dense, yet there seem to be fewer reliable avenues to find a signal in all the noise. One-stop information destinations such as Facebook or Twitter are a thing of the past. The global town square—once the aspirational destination that social-media platforms would offer to all of us—lies in ruins, its architecture choked by the vines and tangled vegetation of a wild informational jungle.
  • Musk has turned X into a deepfake version of Twitter—a facsimile of the once-useful social network, altered just enough so as to be disorienting, even terrifying.
  • At the same time, Facebook’s user base began to erode, and the company’s transparency reports revealed that the most popular content circulating on the platform was little more than viral garbage—a vast wasteland of CBD promotional content and foreign tabloid clickbait.
  • ...4 more annotations...
  • What’s left, across all platforms, is fragmented. News and punditry are everywhere online, but audiences are siloed; podcasts are more popular than ever, and millions of younger people online have turned to influencers and creators on Instagram and especially TikTok as trusted sources of news.
  • Social media, especially Twitter, has sometimes been an incredible news-gathering tool; it has also been terrible and inefficient, a game of do your own research that involves batting away bullshit and parsing half truths, hyperbole, outright lies, and invaluable context from experts on the fly. Social media’s greatest strength is thus its original sin: These sites are excellent at making you feel connected and informed, frequently at the expense of actually being informed.
  • At the center of these pleas for a Twitter alternative is a feeling that a fundamental promise has been broken. In exchange for our time, our data, and even our well-being, we uploaded our most important conversations onto platforms designed for viral advertising—all under the implicit understanding that social media could provide an unparalleled window to the world.
  • What comes next is impossible to anticipate, but it’s worth considering the possibility that the centrality of social media as we’ve known it for the past 15 years has come to an end—that this particular window to the world is being slammed shut.
Javier E

Can Social Networks Do Better? We Don't Know Because They Haven't Tried - Talking Point... - 0 views

  • it’s not fair to say it’s Facebook or a Facebook problem. Facebook is just the latest media and communications medium. We hardly blame the technology of the book for spreading anti-Semitism via the notorious Protocols of the Elders of Zion
  • But of course, it’s not that simple. Social media platforms have distinct features that earlier communications media did not. The interactive nature of the media, the collection of data which is then run through algorithms and artificial intelligence creates something different.
  • All social media platforms are engineered with two basic goals: maximize the time you spend on the platform and make advertising as effective and thus as lucrative as possible. This means that social media can never be simply a common carrier, a distribution technology that has no substantial influence over the nature of the communication that travels over it.
  • ...5 more annotations...
  • it’s a substantial difference which deprives social media platforms of the kind of hands-off logic that would make it ridiculous to say phones are bad or the phone company is responsible if planning for a mass murder was carried out over the phone.
  • the Internet doesn’t ‘do’ anything more than make the distribution of information more efficient and radically lower the formal, informal and financial barriers to entry that used to stand in the way of various marginalized ideas.
  • Social media can never plead innocence like this because the platforms are designed to addict you and convince you of things.
  • If the question is: what can social media platforms do to protect against government-backed subversion campaigns like the one we saw in the 2016 campaign the best answer is, we don’t know. And we don’t know for a simple reason: they haven’t tried.
  • The point is straightforward: the mass collection of data, harnessed to modern computing power and the chance to amass unimaginable wealth has spurred vast technological innovation.
mimiterranova

GOP group 'Stop Stacey' targets Abrams ahead of expected 2022 run | TheHill - 0 views

  • Republican strategists aligned with Georgia Gov. Brian Kemp (R) on Monday launched an outside group aimed at stopping a potential 2022 gubernatorial run by Democrat Stacey Abrams.
  • "We will do whatever it takes to expose Stacey Abrams’ radical network, highlight her dangerous agenda, and ultimately defeat her — and her left-wing candidates — at the ballot box," the group's senior strategist, Jeremy Brand, said in a statement. "There is no time to waste: We must stand up, fight back, and Stop Stacey.”
  • ...1 more annotation...
  • In the two years since Abrams's first run, her group Fair Fight has registered tens of thousands of voters in the Peach State and raised $100 million.