Javier E

Climate Reparations Are Officially Happening - The Atlantic

  • Today, on the opening day of COP28, the United Nations climate summit in Dubai, the host country pushed through a decision that wasn’t expected to happen until the last possible minute of the two-week gathering: the creation and structure of the “loss and damage” fund, which will source money from developed countries to help pay for climate damages in developing ones. For the first time, the world has a system in place for climate reparations.
  • Nearly every country on Earth has now adopted the fund, though the text is not technically final until the end of the conference, officially slated for December 12.
  • “We have delivered history today—the first time a decision has been adopted on day one of any COP,”
  • Over much opposition from developing countries, the U.S. has insisted that the fund (technically named the Climate Impact and Response Fund) will be housed at the World Bank, where the U.S. holds a majority stake; every World Bank president has been a U.S. citizen. The U.S. also insisted that contributing to the fund not be obligatory. Sue Biniaz, the deputy special envoy for climate at the State Department, said earlier this year that she “violently opposes” arguments that developed countries have a legal obligation under the UN framework to pay into the fund.
  • The text agreed upon in Dubai on Thursday appears to strike a delicate balance: The fund will indeed be housed at the World Bank, at least for four years, but it will be run according to direction provided at the UN climate gatherings each year, and managed by a board where developed nations are designated fewer than half the seats.
  • That board’s decisions will supersede those of the World Bank “where appropriate.” Small island nations, which are threatened by extinction because of sea-level rise, will have dedicated seats. Countries that are not members of the World Bank will still be able to access the fund.
  • the U.S. remains adamant that the fund does not amount to compensation for past emissions, and it rejects any whiff of suggestions that it is liable for other countries’ climate damages.
  • Even the name “loss and damage,” with its implication of both harm and culpability, has been contentious among delegates
  • Several countries immediately announced their intended contribution to the fund. The United Arab Emirates and Germany each said they would give $100 million. The U.K. pledged more than $50 million, and Japan committed to $10 million. The U.S. said it would provide $17.5 million, a small number given its responsibility for the largest historical share of global emissions.
  • Total commitments came in on the order of hundreds of millions, far shy of an earlier goal of $100 billion a year.
  • Other donations may continue to trickle in. But the sum is paltry considering researchers recently concluded that 55 climate-vulnerable countries have incurred $525 billion in climate-related losses from 2000 to 2019, depriving them of 20 percent of the wealth they would otherwise have.
  • Still, it’s a big change in how climate catastrophe is treated by developed nations. For the first time, the countries most responsible for climate change are collectively, formally claiming some of that responsibility.
  • One crucial unresolved variable is whether countries such as China and Saudi Arabia—still not treated as “developed” nations under the original UN climate framework—will acknowledge their now-outsize role in worsening climate change by contributing to the fund.
  • Another big question now will be whether the U.S. can get Congress to agree to payments to the fund, something congressional Republicans are likely to oppose.
  • Influence by oil and gas industry interests—arguably the entities truly responsible for driving climate change—now delays even public funding of global climate initiatives, he said. “The fossil-fuel industry has successfully convinced the world that loss and damage is something the taxpayer should pay for.” And yet, Whitehouse told me that the industry lobbies against efforts to use public funding this way, swaying Congress and therefore hobbling the U.S.’s ability to uphold even its meager contributions to international climate funding.

Ozempic or Bust - The Atlantic

  • it is impossible to know, in the first few years of any novel intervention, whether its success will last.
  • The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started
  • Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.
  • The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic.
  • Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”
  • Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)
  • In the United States, an estimated 189 million adults are classified as having obesity or being overweight
  • The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight
  • For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.
  • The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or high cholesterol. By those criteria, more than 140 million American adults already qualify.
  • if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”
  • How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk
  • The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years
  • the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not been borne out.
  • adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time?
  • “All of us, in the back of our minds, always wonder, Will something show up?” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”
  • in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.
  • The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories.
  • But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.
  • Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses.”
  • Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might produce, in the very best scenario, an average lasting weight loss of just 5 to 10 percent.
  • National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.
  • Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm.
  • In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,”
  • In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million.
  • It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life.
  • the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”
  • News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque.
  • Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.”
  • “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”
  • She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”
  • For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.”
  • In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.)
  • In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.
  • The message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water—plain old water—as something “cool” to drink.
  • Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could.
  • Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country.
  • An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.
  • when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.
  • The social engineering worked. Slowly but surely, Americans’ lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out.
  • although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.
  • sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item.”
  • that tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.
  • Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.
  • From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.
  • obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”
  • she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions
  • Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.
  • if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it
  • She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit.
  • Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.
  • “Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.
  • Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.
  • As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid
  • By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners
  • But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.
  • Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon who had struggled to overcome an eating disorder had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look
  • The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. In 2015, that rate had risen to three-quarters.
  • In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”
  • that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity.
  • That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism
  • But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”
  • Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events.
  • As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell.
  • Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed.
  • In the meantime, thinness was coming back into fashion.
  • In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.
  • Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight.
  • If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat.
  • Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.
  • Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”
  • When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.
  • the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now.
  • Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies.
  • “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”
  • The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways?
  • When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.”
  • why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.

Opinion | White Riot - The New York Times

  • how important is the frustration among what pollsters call non-college white men at not being able to compete with those higher up on the socioeconomic ladder because of educational disadvantage?
  • How critical is declining value in marriage — or mating — markets?
  • How toxic is the combination of pessimism and anger that stems from a deterioration in standing and authority? What might engender existential despair, this sense of irretrievable loss?
  • How hard is it for any group, whether it is racial, political or ethnic, to come to terms with losing power and status? What encourages desperate behavior and a willingness to believe a pack of lies?
  • I posed these questions to a wide range of experts. This column explores their replies.
  • While most acute among those possessing high status and power, Anderson said, people in general are sensitive to status threats and to any potential losses of social standing, and they respond to those threats with stress, anxiety, anger, and sometimes even violence.
  • White supremacy and frank racism are prime motivators, and they combined with other elements to fuel the insurrection: a groundswell of anger directed specifically at elites and an addictive lust for revenge against those they see as the agents of their disempowerment.
  • It is this admixture of factors that makes the insurgency that wrested control of the House and Senate so dangerous — and is likely to spark new forms of violence in the future.
  • The population of U.S. citizens who’ve lost the most power in the past 40 years, who aren’t competing well to get into college or get high-paying jobs, whose marital prospects have dimmed, and who are outraged, are those I believe were most likely to be in on the attack.
  • The terrorist attacks on 9/11, the Weatherman bombings in protest of the Vietnam War, ethnic cleansing in Bosnia, or the assassination of abortion providers, may be motivated by different ideological beliefs but nonetheless share a common theme: The people who did these things appear to be motivated by strong moral conviction. Although some argue that engaging in behaviors like these requires moral disengagement, we find instead that they require maximum moral engagement and justification.
  • “lower class individuals experience greater vigilance to threat, relative to high status individuals, leading them to perceive greater hostility in their environment.”
  • This increased vigilance, Brinke and Keltner continue, creates a bias such that relatively low socio-economic status individuals perceive the powerful as dominant and threatening—endorsing a coercive theory of power.
  • there is evidence that individuals of lower social class are more cynical than those occupying higher classes, and that this cynicism is directed toward out-group members — that is, those that occupy higher classes.
  • Before Trump, many of those who became his supporters suffered from what Carol Graham, a senior fellow at Brookings, describes as pervasive “unhappiness, stress and lack of hope” without a narrative to legitimate their condition:
  • When the jobs went away, families fell apart. There was no narrative other than the classic American dream that everyone who works hard can get ahead, and the implicit correlate was that those who fall behind and are on welfare are losers, lazy, and often minorities.
  • What, however, could prompt a mob — including not only members of the Proud Boys and the Boogaloo Bois but also many seemingly ordinary Americans drawn to Trump — to break into the Capitol?
  • One possible answer: a mutated form of moral certitude based on the belief that one’s decline in social and economic status is the result of unfair, if not corrupt, decisions by others, especially by so-called elites.
  • There is evidence that many non-college white Americans who have been undergoing what psychiatrists call “involuntary subordination” or “involuntary defeat” both resent and mourn their loss of centrality and what they perceive as their growing invisibility.
  • violence is: considered to be the essence of evil. It is the prototype of immorality. But an examination of violent acts and practices across cultures and throughout history shows just the opposite. When people hurt or kill someone, they usually do it because they feel they ought to: they feel that it is morally right or even obligatory to be violent.
  • “Most violence,” Fiske and Rai contend, “is morally motivated.”
  • A key factor working in concert to aggravate the anomie and disgruntlement in many members of Trump’s white working-class base is their inability to obtain a college education, a limitation that blocks access to higher paying jobs and lowers their supposed “value” in marriage markets.
  • In their paper “Trends in Educational Assortative Marriage From 1940 to 2003,” Christine R. Schwartz and Robert D. Mare, professors of sociology at the University of Wisconsin and the University of California-Los Angeles, wrote that the “most striking” data in their research, “is the decline in odds that those with very low levels of education marry up.”
  • there is very consistent and compelling evidence to suggest that some of what we have witnessed this past week is a reflection of the angst, anger, and refusal to accept an “America” in which White (Christian) Americans are losing dominance, be it political, material, and/or cultural. And, I use the term dominance here, because it is not simply a loss of status. It is a loss of power. A more racially, ethnically, religiously diverse US that is also a democracy requires White Americans to acquiesce to the interests and concerns of racial/ethnic and religious minorities.
  • In this new world, Federico argues, “promises of broad-based economic security” were replaced by a job market where you can have dignity, but it must be earned through market or entrepreneurial success (as the Reagan/Thatcher center-right would have it) or the meritocratic attainment of professional status (as the center-left would have it). But obviously, these are not avenues available to all, simply because society has only so many positions for captains of industry and educated professionals.
  • The result, Federico notes, is that “group consciousness is likely to emerge on the basis of education and training” and when “those with less education see themselves as being culturally very different from an educated stratum of the population that is more socially liberal and cosmopolitan, then the sense of group conflict is deepened.”
  • A major development since the end of the “Great Compression” of the 30 years or so after World War II, when there was less inequality and relatively greater job security, at least for white male workers, is that the differential rate of return on education and training is now much higher.
  • Trump, Richeson continued, leaned into the underlying White nationalist sentiments that had been on the fringe in his campaign for the presidency and made his campaign about re-centering Whiteness as what it actually means to be American and, by implication, delegitimizing claims for greater racial equity, be it in policing or any other important domain of American life.
  • Whites in the last 60 years have seen minoritized folks gain more political power, economic and educational opportunity. Even though these gains are grossly exaggerated, Whites experience them as a loss in group status.
  • all the rights revolutions — civil rights, women’s rights, gay rights — have been key to the emergence of the contemporary right wing: As the voices of women, people of color, and other traditionally marginalized communities grow louder, the frame of reference from which we tell the story of America is expanding
  • The white male story is not irrelevant but it’s insufficient, and when you have a group of people that are accustomed to the spotlight see the camera lens pan away, it’s a threat to their sense of self. It’s not surprising that QAnon support started to soar in the weeks after B.L.M. QAnon offers a way for white evangelicals to place blame on (fictional) bad people instead of a broken system. It’s an organization that validates the source of QAnoners’ insecurity — irrelevance — and in its place offers a steady source of self-righteousness and acceptance.
  • “compared to other advanced countries caught up in the transition to knowledge society, the United States appears to be in a much more vulnerable position to a strong right-wing populist challenge.”
  • First, Kitschelt noted: The difference between economic winners and losers, captured by income inequality, poverty, and illiteracy rates within the dominant white ethnicity, is much greater than in most other Western countries, and there is no dense welfare state safety net to buffer the fall of people into unemployment and poverty.
  • Another key factor, Kitschelt pointed out, is that the decline of male status in the family is more sharply articulated than in Europe, hastened in the U.S. by economic inequality (men fall further under changing economic circumstances) and religiosity (leading to pockets of greater male resistance to the redefinition of gender roles).
  • More religious and less well-educated whites see Donald Trump as one of their own despite his being so obviously a child of privilege. He defends America as a Christian nation. He defends English as our national language. He is unashamed in stating that the loyalty of any government should be to its own citizens — both in terms of how we should deal with noncitizens here and how our foreign policy should be based on the doctrine of “America First.”
  • On top of that, in the United States, many lines of conflict mutually reinforce each other rather than crosscut: Less educated whites tend to be more Evangelical and more racist, and they live in geographical spaces with less economic momentum.
  • for the moment the nation faces, for all intents and purposes, the makings of a civil insurgency. What makes this insurgency unusual in American history is that it is based on Trump’s false claim that he, not Joe Biden, won the presidency, that the election was stolen by malefactors in both parties, and that majorities in both branches of Congress no longer represent the true will of the people.
  • We would not have Trump as president if the Democrats had remained the party of the working class. The decline of labor unions proceeded at the same rate when Democrats were president as when Republicans were president; the same is, I believe, true of loss of manufacturing jobs as plants moved overseas.
  • President Obama, Grofman wrote, responded to the housing crisis with bailouts of the lenders and interlinked financial institutions, not of the folks losing their homes. And the stagnation of wages and income for the middle and bottom of the income distribution continued under Obama. And the various Covid aid packages, while they include payments to the unemployed, are also helping big businesses more than the small businesses that have been and will be permanently going out of business due to the lockdowns (and they include various forms of pork).
  • “white less well-educated voters didn’t desert the Democratic Party, the Democratic Party deserted them.”
  • Unlike most European countries, Kitschelt wrote, the United States had a civil war over slavery in the 19th century and a continuous history of structural racism and white oligarchical rule until the 1960s, and in many aspects until the present. Europe lacks this legacy.
  • He speaks in a language that ordinary people can understand. He makes fun of the elites who look down on his supporters as a “basket of deplorables” and who think it is a good idea to defund the police who protect them and to prioritize snail darters over jobs. He appoints judges and justices who are true conservatives. He believes more in gun rights than in gay rights. He rejects political correctness and the language-police and woke ideology as un-American. And he promises to reclaim the jobs that previous presidents (of both parties) allowed to be shipped abroad. In sum, he offers a relatively coherent set of beliefs and policies that are attractive to many voters and which he has been better at seeing implemented than any previous Republican president.
  • What Trump supporters who rioted in D.C. share are the beliefs that Trump is their hero, regardless of his flaws, and that defeating Democrats is a holy war to be waged by any means necessary.
  • In the end, Grofman said: Trying to explain the violence on the Hill by only talking about what the demonstrators believe is to miss the point. They are guilty, but they wouldn’t be there were it not for the Republican politicians and the Republican attorneys general, and most of all the president, who cynically exaggerate and lie and create fake conspiracy theories and demonize the opposition. It is the enablers of the mob who truly deserve the blame and the shame.
Javier E

Opinion | Ozempic Is Repairing a Hole in Our Diets Created by Processed Foods - The New... - 0 views

  • In the United States (where I now split my time), over 70 percent of people are overweight or obese, and according to one poll, 47 percent of respondents said they were willing to pay to take the new weight-loss drugs.
  • They cause users to lose an average of 10 to 20 percent of their body weight, and clinical trials suggest that the next generation of drugs (probably available soon) leads to a 24 percent loss, on average
  • I was born in 1979, and by the time I was 21, obesity rates in the United States had more than doubled. They have skyrocketed since. The obvious question is, why? And how do these new weight-loss drugs work?
  • The answer to both lies in one word: satiety. It’s a concept that we don’t use much in everyday life but that we’ve all experienced at some point. It describes the sensation of having had enough and not wanting any more.
  • The primary reason we have gained weight at a pace unprecedented in human history is that our diets have radically changed in ways that have deeply undermined our ability to feel sated
  • The evidence is clear that the kind of food my father grew up eating quickly makes you feel full. But the kind of food I grew up eating, much of which is made in factories, often with artificial chemicals, left me feeling empty and as if I had a hole in my stomach
  • In a recent study of what American children eat, ultraprocessed food was found to make up 67 percent of their daily diet. This kind of food makes you want to eat more and more. Satiety comes late, if at all.
  • After he moved to the United States in 2000, in his 20s, he gained 30 pounds in two years. He began to wonder if the American diet had some kind of strange effect on our brains and our cravings, so he designed an experiment to test it.
  • He and his colleague Paul Johnson raised a group of rats in a cage and gave them an abundant supply of healthy, balanced rat chow made out of the kind of food rats had been eating for a very long time. The rats would eat it when they were hungry, and then they seemed to feel sated and stopped. They did not become fat.
  • then Dr. Kenny and his colleague exposed the rats to an American diet: fried bacon, Snickers bars, cheesecake and other treats. They went crazy for it. The rats would hurl themselves into the cheesecake, gorge themselves and emerge with their faces and whiskers totally slicked with it. They quickly lost almost all interest in the healthy food, and the restraint they used to show around healthy food disappeared. Within six weeks, their obesity rates soared.
  • They took all the processed food away and gave the rats their old healthy diet. Dr. Kenny was confident that they would eat more of it, proving that processed food had expanded their appetites. But something stranger happened. It was as though the rats no longer recognized healthy food as food at all, and they barely ate it. Only when they were starving did they reluctantly start to consume it again.
  • Drugs like Ozempic work precisely by making us feel full.
  • processed and ultraprocessed food create a raging hole of hunger, and these treatments can repair that hole
  • the drugs are “an artificial solution to an artificial problem.”
  • Yet we have reacted to this crisis largely caused by the food industry as if it were caused only by individual moral dereliction
  • Why do we turn our anger inward and not outward at the main cause of the crisis? And by extension, why do we seek to shame people taking Ozempic but not those who, say, take drugs to lower their blood pressure?
  • The first is the belief that obesity is a sin.
  • The second idea is that we are all in a competition when it comes to weight. Ours is a society full of people fighting against the forces in our food that are making us fatter.
  • Looked at in this way, people on Ozempic can resemble cyclists like Lance Armstrong who used performance-enhancing drugs.
  • We can’t find our way to a sane, nontoxic conversation about obesity or Ozempic until we bring these rarely spoken thoughts into the open and reckon with them
  • remember the competition isn’t between you and your neighbor who’s on weight-loss drugs. It’s between you and a food industry constantly designing new ways to undermine your satiety.
  • Reducing or reversing obesity hugely boosts health, on average: We know from years of studying bariatric surgery that it slashes the risks of cancer, heart disease and diabetes-related death. Early indications are that the new anti-obesity drugs are moving people in a similar radically healthier direction,
  • But these drugs may increase the risk for thyroid cancer.
  • Do we want these weight loss drugs to be another opportunity to tear one another down? Or do we want to realize that the food industry has profoundly altered the appetites of us all — leaving us trapped in the same cage, scrambling to find a way out?
Javier E

Steven Mnuchin's Defining Moment: Seizing Opportunity From the Financial Crisis - WSJ - 0 views

  • On a muggy morning in July 2008, hundreds of customers stood outside IndyMac Bank branches in Southern California, trying to pull their savings from the lender, which was doomed by losses on risky mortgages.
  • Steven Mnuchin didn’t know much about IndyMac as he watched the scenes on CNBC from his Midtown Manhattan office. But he immediately saw an opportunity and began figuring out how to buy the bank.
  • Regulators seized IndyMac, foreshadowing a vicious banking crisis. Six months later, Mr. Mnuchin and his investment partners acquired IndyMac with a helping hand from the U.S. government. The deal eventually earned him hundreds of millions of dollars in personal profits.
  • If confirmed by the Senate, the defining traits he will bring as the 77th Treasury secretary include a Wall Street pedigree, long relationship with Mr. Trump, and a history of moving fast to seize opportunities that might terrify others
  • IndyMac was the defining deal of Mr. Mnuchin’s career. He knew that the government needed to sell the failed bank—and he played hardball.
  • Like other Trump cabinet picks, Mr. Mnuchin has a résumé that is at odds with much of the president-elect’s populist rhetoric on the campaign trail.
  • Mr. Mnuchin is regarded within the Trump transition team’s inner circle as a skilled team player. Mr. Trump’s advisers say Mr. Mnuchin will fuse traditional Republican Party support for lower taxes and less regulation with the president-elect’s populist stances on trade and infrastructure.
  • The bank, which was renamed OneWest Bank and is now part of CIT Group Inc., is under civil investigation by the Department of Housing and Urban Development for loan-servicing practices.
  • Mr. Mnuchin, whose father spent his entire career at Goldman, came of age on Wall Street in the 1980s as the business of slicing loans into securities was booming. As a mortgage banker at Goldman, he saw up close the savings-and-loan crisis and efforts by the government to wind down hundreds of insolvent financial institutions.
  • Like other partners, he earned tens of millions of dollars when Goldman became a publicly traded company in 1999. He bought a 6,500-square-foot apartment in a famous Park Avenue building. Messrs. Mnuchin and Trump were soon in the same philanthropic and social circles,
  • Mr. Mnuchin donated to the campaigns of Democrats Barack Obama, John Edwards, John Kerry and Al Gore. The only Republican presidential candidate Mr. Mnuchin gave money to was Mitt Romney in 2012.
  • It was the second-largest bank failure of the crisis, surpassed only by Washington Mutual Inc. in September 2008.
  • At the end of 2008, Mr. Mnuchin persuaded the FDIC to sell IndyMac for about $1.5 billion. The deal included IndyMac branches, deposits and assets. The FDIC also agreed to protect the buyers from the most severe losses for years. That loss-sharing arrangement turned out to be a master stroke.
  • Banks often go out of their way to avoid losses, even when borrowers are in violation of loan terms. The loss-sharing agreement took away some of the disincentives, since future losses would be borne partly by the government.
  • In July 2014, CIT agreed to buy OneWest for $3.4 billion, a bounty of more than $3 billion, including dividends. Mr. Mnuchin’s take was several hundred million dollars, according to a person familiar with the matter.
  • Before formally launching his presidential bid, Mr. Trump turned to Mr. Mnuchin for advice over dinner. Mr. Mnuchin helped write a tax-cutting plan and tried to rein in some of Mr. Trump’s populist rhetoric, including his vow to not “let Wall Street get away with murder,” people familiar with the matter said.
  • Mr. Trump’s financial agenda, which Mr. Mnuchin would lead as Treasury secretary, has ignited a broad stock-market rally. CIT shares are up about 13%, increasing the value of Mr. Mnuchin’s stake by about $11 million. It is now worth more than $100 million.
Javier E

How Climate Change Is Contributing to Skyrocketing Rates of Infectious Disease | Talkin... - 0 views

  • The scientists who study how diseases emerge in a changing environment knew this moment was coming. Climate change is making outbreaks of disease more common and more dangerous.
  • Over the past few decades, the number of emerging infectious diseases that spread to people — especially coronaviruses and other respiratory illnesses believed to have come from bats and birds — has skyrocketed.
  • A new emerging disease surfaces five times a year. One study estimates that more than 3,200 strains of coronaviruses already exist among bats, awaiting an opportunity to jump to people.
  • until now, the planet’s natural defense systems were better at fighting them off.
  • Today, climate warming is demolishing those defense systems, driving a catastrophic loss in biodiversity that, when coupled with reckless deforestation and aggressive conversion of wildland for economic development, pushes farms and people closer to the wild and opens the gates for the spread of disease.
  • ignoring how climate and rapid land development were putting disease-carrying animals in a squeeze was akin to playing Russian roulette.
  • the virus is believed to have originated with the horseshoe bat, part of a genus that’s been roaming the forests of the planet for 40 million years and thrives in the remote jungles of south China, though even that remains uncertain.
  • China for years and warning that swift climate and environmental change there — in both loss of biodiversity and encroachment by civilization — was going to help new viruses jump to people.
  • Roughly 60% of new pathogens come from animals — including those pressured by diversity loss — and roughly one-third of those can be directly attributed to changes in human land use, meaning deforestation, the introduction of farming, development or resource extraction in otherwise natural settings
  • Vector-borne diseases — those carried by insects like mosquitoes and ticks and transferred in the blood of infected people — are also on the rise as warming weather and erratic precipitation vastly expand the geographic regions vulnerable to contagion.
  • Climate is even bringing old viruses back from the dead, thawing zombie contagions like the anthrax released from a frozen reindeer in 2016, which can come down from the arctic and haunt us from the past.
  • It is demonstrating in real time the enormous and undeniable power that nature has over civilization and even over its politics.
  • it also makes clear that climate policy today is indivisible from efforts to prevent new infectious outbreaks, or, as Bernstein put it, the notion that climate and health and environmental policy might not be related is “a dangerous delusion.”
  • The warming of the climate is one of the principal drivers of the greatest — and fastest — loss of species diversity in the history of the planet, as shifting climate patterns force species to change habitats, push them into new regions or threaten their food and water supplies
  • What’s known as biodiversity is critical because the natural variety of plants and animals lends each species greater resiliency against threat and together offers a delicately balanced safety net for natural systems
  • As diversity wanes, the balance is upset, and remaining species are both more vulnerable to human influences and, according to a landmark 2010 study in the journal Nature, more likely to pass along powerful pathogens.
  • even incremental and seemingly manageable injuries to local environments — say, the construction of a livestock farm adjacent to stressed natural forest — can add up to outsized consequences.
  • Coronaviruses like COVID-19 aren’t likely to be carried by insects — they don’t leave enough infected virus cells in the blood. But one in five other viruses transmitted from animals to people are vector-borne
  • the number of species on the planet has already dropped by 20% and that more than a million animal and plant species now face extinction.
  • Americans have been experiencing this phenomenon directly in recent years as migratory birds have become less diverse and the threat posed by West Nile encephalitis has spread. It turns out that the birds that host the disease happen to also be the tough ones that prevail amid a thinned population
  • as larger mammals suffer declines at the hands of hunters or loggers or shifting climate patterns, smaller species, including bats, rats and other rodents, are thriving, either because they are more resilient to the degraded environment or they are able to live better among people.
  • It is these small animals, the ones that manage to find food in garbage cans or build nests in the eaves of buildings, that are proving most adaptable to human interference and also happen to spread disease.
  • Warmer temperatures and higher rainfall associated with climate change — coupled with the loss of predators — are bound to make the rodent problem worse, with calamitous implications.
  • As much as weather changes can drive changes in species, so does altering the landscape for new farms and new cities. In fact, researchers attribute a full 30% of emerging contagion to what they call “land use change.”
  • As the global population surges to 10 billion over the next 35 years, and the capacity to farm food is stressed further again by the warming climate, the demand for land will only get more intense.
  • Already, more than one-third of the planet’s land surface, and three-quarters of all of its fresh water, go toward the cultivation of crops and raising of livestock. These are the places where infectious diseases spread most often.
  • The U.S. Centers for Disease Control and Prevention says that fully three-quarters of all new viruses have emerged from animals
  • Almost every major epidemic we know of over the past couple of decades — SARS, COVID-19, Ebola and Nipah virus — jumped to people from wildlife enduring extreme climate and habitat strain, and still, “we’re naive to them,” she said. “That puts us in a dangerous place.”
  • A 2008 study in the journal Nature found nearly one-third of emerging infectious diseases over the past 10 years were vector-borne, and that the jumps matched unusual changes in the climate
  • Ticks and mosquitoes now thrive in places they’d never ventured before. As tropical species move northward, they are bringing dangerous pathogens with them.
  • by 2050, disease-carrying mosquitoes will ultimately reach 500 million more people than they do today, including some 55 million more Americans.
  • In 2013, dengue fever — an affliction affecting nearly 400 million people a year, but normally associated with the poorest regions of Africa — was transmitted locally in New York for the first time.
  • “The long-term risk from dengue may be much higher than COVID.”
  • only 15% of the planet’s forests remain intact. The rest have been cut down, degraded or fragmented to the point that they disrupt the natural ecosystems that depend on them.
  • it’s only a matter of time before other exotic animal-driven pathogens are driven from the forests of the global tropics to the United States or Canada or Europe because of the warming climate.
  • it will also shape how easily we get sick. According to a 2013 study in the journal PLOS Currents Influenza, warm winters were predictors of the most severe flu seasons in the following year
  • Even harsh swings from hot to cold, or sudden storms — exactly the kinds of climate-induced patterns we’re already seeing — make people more likely to get sick.
  • The chance of a flu epidemic in America’s most populated cities will increase by as much as 50% this century, and flu-related deaths in Europe could also jump by 50%.
  • Slow action on climate has made dramatic warming and large-scale environmental changes inevitable, he said, “and I think that increases in disease are going to come along with it.”
  • By late 2018, epidemiologists there were bracing for what they call “spillover,” or the failure to keep a virus locally contained as it jumped from the bats and villages of Yunnan into the wider world.
  • In late 2018, the Trump administration, as part of a sweeping effort to bring U.S. programs in China to a halt, abruptly shut down the research — and its efforts to intercept the spread of a new novel coronavirus along with it. “We got a cease and desist,” said Dennis Carroll, who founded the PREDICT program and has been instrumental in global work to address the risks from emerging viruses. By late 2019, USAID had cut the program’s global funding.
  • The loss is immense. The researchers believed they were on the cusp of a breakthrough, racing to sequence the genes of the coronaviruses they’d extracted from the horseshoe bat and to begin work on vaccines.
  • They’d campaigned for years for policymakers to fully consider what they’d learned about how land development and climate changes were driving the spread of disease, and they thought their research could literally provide governments a map to the hot spots most likely to spawn the next pandemic.
  • They also hoped the genetic material they’d collected could lead to a vaccine not just for one lethal variation of COVID, but perhaps — like a missile defense shield for the biosphere — to address a whole family of viruses at once
  • Carroll said knowledge of the virus genomes had the potential “to totally transform how we think about future biomedical interventions before there’s an emergence.
  • PREDICT’s staff and advisers have pushed the U.S. government to consider how welding public health policy with environmental and climate science could help stem the spread of contagions.
  • Since Donald Trump was elected, the group hasn’t been invited back.
  • What Daszak really wants — in addition to restored funding to continue his work — is the public and leaders to understand that it’s human behavior driving the rise in disease, just as it drives the climate crisis
  • “We turn a blind eye to the fact that our behavior is driving this,” he said. “We get cheap goods through Walmart, and then we pay for it forever through the rise in pandemics. It’s upside down.”
ethanshilling

U.S. Disaster Costs Doubled in 2020, Reflecting Costs of Climate Change - The New York ... - 0 views

  • Hurricanes, wildfires and other disasters across the United States caused $95 billion in damage last year, according to new data, almost double the amount in 2019 and the third-highest losses since 2010.
  • Those losses occurred during a year that was one of the warmest on record, a trend that makes extreme rainfall, wildfires, droughts and other environmental catastrophes more frequent and intense.
  • Topping the list was Hurricane Laura, which caused $13 billion in damage when it struck Southwestern Louisiana in late August.
  • The storms caused $43 billion in losses, almost half the total for all U.S. disasters last year.
  • The next costliest category of natural disasters was convective storms, which includes thunderstorms, tornadoes, hailstorms and derechos, and caused $40 billion in losses last year.
  • Wildfires caused another $16 billion in losses. Last year’s wildfires stood out not just because of the numbers of acres burned or houses destroyed, Munich Re said, but also because so much of that damage was outside of California
  • In California, officials have tried a series of rule changes designed to stop insurers from pulling out of fire-prone areas, leaving homeowners with few options for insurance.
  • The data also shows another worrying trend: The lack of insurance coverage in developing countries, which makes it harder for people there to recover after a disaster.
  • The single costliest disaster of 2020 was a series of floods that hit China last summer, which according to Munich Re caused $17 billion worth of damage.
  • Of the $67 billion in losses from natural disasters across Asia last year, only $3 billion, or 4.5 percent, was covered by insurance.
  • Without insurance, Mr. Rauch said, “the opportunity to recover fast after such an event is simply not there.”
Javier E

Opinion | Climate Change Is Real. Markets, Not Governments, Offer the Cure. - The New Y... - 0 views

  • For years, I saw myself not as a global-warming denier (a loaded term with its tendentious echo of Holocaust denial) but rather as an agnostic on the causes of climate change and a scoffer at the idea that it was a catastrophic threat to the future of humanity.
  • It’s not that I was unalterably opposed to the idea that, by pumping carbon dioxide into the atmosphere, modern civilization was contributing to the warming by 1 degree Celsius and the inches of sea-level rise the planet had experienced since the dawn of the industrial age. It’s that the severity of the threat seemed to me wildly exaggerated and that the proposed cures all smacked of old-fashioned statism mixed with new-age religion.
  • Hadn’t we repeatedly lived through previous alarms about other, allegedly imminent, environmental catastrophes that didn’t come to pass, like the belief, widespread in the 1970s, that overpopulation would inevitably lead to mass starvation? And if the Green Revolution had spared us from that Malthusian nightmare, why should we not have confidence that human ingenuity wouldn’t also prevent the parade of horribles that climate change was supposed to bring about?
  • ...63 more annotations...
  • I had other doubts, too. It seemed hubristic, or worse, to make multitrillion-dollar policy bets based on computer models trying to forecast climate patterns decades into the future. Climate activists kept promoting policies based on technologies that were either far from mature (solar energy) or sometimes actively harmful (biofuels).
  • Expensive efforts to curb greenhouse gas emissions in Europe and North America seemed particularly fruitless when China, India and other developing countries weren’t about to curb their own appetite for fossil fuels
  • just how fast is Greenland’s ice melting right now? Is this an emergency for our time, or is it a problem for the future?
  • His pitch was simple: The coastline we have taken for granted for thousands of years of human history changed rapidly in the past on account of natural forces — and would soon be changing rapidly and disastrously by man-made ones. A trip to Greenland, which holds one-eighth of the world’s ice on land (most of the rest is in Antarctica) would show me just how drastic those changes have been. Would I join him?
  • Greenland is about the size of Alaska and California combined and, except at its coasts, is covered by ice that in places is nearly two miles thick. Even that’s only a fraction of the ice in Antarctica, which is more than six times as large
  • Greenland’s ice also poses a nearer-term risk because it is melting faster. If all its ice were to melt, global sea levels would rise by some 24 feet. That would be more than enough to inundate hundreds of coastal cities in scores of nations, from Jakarta and Bangkok to Copenhagen and Amsterdam to Miami and New Orleans.
  • There was also a millenarian fervor that bothered me about climate activism, with its apocalyptic imagery (the Statue of Liberty underwater) and threats of doom unless we were willing to live far more frugally.
  • “We haven’t had a good positive mass balance year since the late 1990s,” he told me in a follow-on email when I asked him to explain the data for me. The losses can vary sharply by year. The annualized average over the past 30 years, he added, is 170 gigatons per year. That’s the equivalent of about 5,400 tons of ice loss per second. That “suggests that Greenland ice loss has been tracking the I.P.C.C. worst-case, highest-carbon-emission scenario.”
  • The data shows unmistakably that Greenland’s ice is not in balance. It is losing far more than it is gaining.
  • scientists have been drilling ice-core samples from Greenland for decades, giving them a very good idea of climatic changes stretching back thousands of years. Better yet, a pair of satellites that detect anomalies in Earth’s gravity fields have been taking measurements of the sheet regularly for nearly 20 years, giving scientists a much more precise idea of what is happening.
  • it’s hard to forecast with any precision what that means. “Anyone who says they know what the sea level is going to be in 2100 is giving you an educated guess,” said NASA’s Willis. “The fact is, we’re seeing these big ice sheets melt for the first time in history, and we don’t really know how fast they can go.”
  • His own educated guess: “By 2100, we are probably looking at more than a foot or two and hopefully less than seven or eight feet. But we are struggling to figure out just how fast the ice sheets can melt. So the upper end of range is still not well known.”
  • On the face of it, that sounds manageable. Even if sea levels rise by eight feet, won’t the world have nearly 80 years to come to grips with the problem, during which technologies that help us mitigate the effects of climate change while adapting to its consequences are likely to make dramatic advances?
  • Won’t the world — including countries that today are poor — become far richer and thus more capable of weathering the floods, surges and superstorms?
  • The average rate at which sea level is rising around the world, he estimates, has more than tripled over the past three decades, to five millimeters a year from 1.5 millimeters. That may still seem minute, yet as the world learned during the pandemic, exponential increases have a way of hitting hard.
  • “When something is on a straight line or a smooth curve, you can plot its trajectory,” Englander said. “But sea level, like earthquakes and mudslides, is something that happens irregularly and can change rather quickly and surprise us. The point is, you can no longer predict the future by the recent past.”
  • In The Wall Street Journal’s editorial pages, where I used to work, the theoretical physicist Steven Koonin, a former under secretary for science in the Obama administration’s Energy Department, cast doubt on the threat from Thwaites in a voice that could have once been mine. He also thinks the risks associated with Greenland’s melting are less a product of human-induced global warming than of natural cycles in North Atlantic currents and temperatures, which over time have a way of regressing to the mean.
  • Even the poorest countries, while still unacceptably vulnerable, are suffering far fewer human and economic losses to climate-related disasters.
  • Another climate nonalarmist is Roger Pielke Jr., a professor of environmental studies at the University of Colorado Boulder. I call Pielke a nonalarmist rather than a skeptic because he readily acknowledges that the challenges associated with climate change, including sea-level rise, are real, serious and probably unstoppable, at least for many decades.
  • “If we have to have a problem,” he told me when I reached him by phone, “we probably want one with a slow onset that we can see coming. It’s not like an asteroid coming from space.”
  • “Since the 1940s, the impact of floods as a proportion of U.S. gross domestic product has dropped by 70 percent-plus,” Pielke said. “We see this around the world, across phenomena. The story is that fewer people are dying and we are having less damage proportional to G.D.P.”
  • “Much climate reporting today highlights short-term changes when they fit the narrative of a broken climate but then ignores or plays down changes when they don’t, often dismissing them as ‘just weather,’” he wrote in February.
  • Global warming is real and getting worse, Pielke said, yet still it’s possible that humanity will be able to adapt to, and compensate for, its effects.
  • A few years ago, I would have found voices like Koonin’s and Pielke’s persuasive. Now I’m less sure. What intervened was a pandemic.
  • That’s what I thought until the spring of 2020, when, along with everyone else, I experienced how swiftly and implacably nature can overwhelm even the richest and most technologically advanced societies. It was a lesson in the sort of intellectual humility I recommended for others
  • It was also a lesson in thinking about risk, especially those in the category known as high-impact, low-probability events that seem to be hitting us with such regularity in this century: the attacks of Sept. 11, 2001; the tsunamis of 2004 and 2011, the mass upheavals in the Arab world
  • What if the past does nothing to predict the future? What if climate risks do not evolve gradually and relatively predictably but instead suddenly soar uncontrollably? How much lead time is required to deal with something like sea-level rise? How do we weigh the risks of underreacting to climate change against the risks of overreacting to it?
  • I called Seth Klarman, one of the world’s most successful hedge-fund managers, to think through questions of risk. While he’s not an expert on climate change, he has spent decades thinking deeply about every manner of risk
  • And we will almost certainly have to do it from sources other than Russia, China, the Democratic Republic of Congo and other places that pose unacceptable strategic, environmental or humanitarian risks
  • “If you face something that is potentially existential,” he explained, “existential for nations, even for life as we know it, even if you thought the risk is, say, 5 percent, you’d want to hedge against it.”
  • “One thing we try to do,” he said, “is we buy protection when it’s really inexpensive, even when we think we may well not need it.” The forces contributing to climate change, he noted, echoing Englander, “might be irreversible sooner than the damage from climate change has become fully apparent. You can’t say it’s far off and wait when, if you had acted sooner, you might have dealt with it better and at less cost. We have to act now.”
  • In other words, an ounce of prevention is worth a pound of cure. That’s particularly true if climate change is akin to cancer — manageable or curable in its earlier stages, disastrous in its later ones.
  • As I’ve always believed, knowing there is grave risk to future generations — and expecting current ones to make immediate sacrifices for it — defies most of what we know about human nature. So I began to think more deeply about that challenge, and others.
  • For the world to achieve the net-zero goal for carbon dioxide emissions by 2050, according to the International Energy Agency, we will have to mine, by 2040, six times the current amounts of critical minerals — nickel, cobalt, copper, lithium, manganese, graphite, chromium, rare earths and other minerals and elements — needed for electric vehicles, wind turbines and solar panels.
  • The poster child for this kind of magical thinking is Germany, which undertook a historic Energiewende — “energy revolution” — only to come up short. At the turn of the century, Germany got about 85 percent of its primary energy from fossil fuels. Now it gets about 78 percent, a puny reduction, considering that the country has spent massive sums on renewables to increase the share of electricity it generates from them.
  • As in everything else in life, so too with the environment: There is no such thing as a free lunch. Whether it’s nuclear, biofuels, natural gas, hydroelectric or, yes, wind and solar, there will always be serious environmental downsides to any form of energy when used on a massive scale. A single industrial-size wind turbine, for instance, typically requires about a ton of rare earth metals as well as three metric tons of copper, which is notoriously destructive and dirty to mine.
  • no “clean energy” solution will easily liberate us from our overwhelming and, for now, inescapable dependence on fossil fuels.
  • Nobody brings the point home better than Vaclav Smil, the Canadian polymath whose most recent book, “How the World Really Works,” should be required reading for policymakers and anyone else interested in a serious discussion about potential climate solutions.
  • “I’ve talked to so many experts and seen so much evidence,” he told me over Zoom, “I’m convinced the climate is changing, and addressing climate change has become a philanthropic priority of mine.”
  • Things could turn a corner once scientists finally figure out a technical solution to the energy storage problem. Or when governments and local actors get over their NIMBYism when it comes to permitting and building a large energy grid to move electricity from Germany’s windy north to its energy-hungry south. Or when thoughtful environmental activists finally come to grips with the necessity of nuclear energy
  • Till then, even as I’ve come to accept the danger we face, I think it’s worth extending the cancer metaphor a little further: Just as cancer treatments, when they work at all, can have terrible side effects, much the same can be said of climate treatments: The gap between an accurate diagnosis and effective treatment remains dismayingly wide
  • Only when countries like Vietnam and China turned to a different model, of largely bottom-up, market-driven development, did hundreds of millions of people get lifted out of destitution.
  • the most important transformation has come in agriculture, which uses about 70 percent of the world’s freshwater supply.
  • Farmers gradually adopted sprinkler and drip irrigation systems, rather than more wasteful flood irrigation, not to conserve water but because the technology provided higher crop yields and larger profit margins.
  • Water shortages “will spur a revolutionary, aggressive approach to getting rid of flood irrigation,” said Seth Siegel, the chief sustainability officer of the Israeli AgTech company N-Drip. “Most of this innovation will be driven by free-market capitalism, with important incentives from government and NGOs.
  • meaningful environmental progress has been made through market forces. In this century, America’s carbon dioxide emissions across fuel types have fallen to well below 5,000 million metric tons per year, from a peak of about 6,000 million in 2007, even as our inflation-adjusted G.D.P. has grown by over 50 percent and total population by about 17 percent.
  • 1) Engagement with critics is vital. Insults and stridency are never good tools of persuasion, and trying to cow or censor climate skeptics into silence rarely works
  • the biggest single driver in emissions reductions from 2005 to 2017 was the switch from coal to natural gas for power generation, since gas produces roughly half as much carbon dioxide as coal. This, in turn, was the result of a fracking revolution in the past decade, fiercely resisted by many environmental activists, that made the United States the world’s largest gas producer.
  • In the long run, we are likelier to make progress when we adopt partial solutions that work with the grain of human nature, not big ones that work against it
  • Renewables, particularly wind power, played a role. So did efficiency mandates.
  • The problem with our civilization isn’t overconfidence. It’s polarization, paralysis and a profound lack of trust in all institutions, including the scientific one
  • Devising effective climate policies begins with recognizing the reality of the social and political landscape in which all policy operates. Some thoughts on how we might do better:
  • They may not be directly related to climate change but can nonetheless have a positive impact on it. And they probably won’t come in the form of One Big Idea but in thousands of little ones whose cumulative impacts add up.
  • 2) Separate facts from predictions and predictions from policy. Global warming is a fact. So is the human contribution to it. So are observed increases in temperature and sea levels. So are continued increases if we continue to do more of the same. But the rate of those increases is difficult to predict even with the most sophisticated computer modeling
  • 3) Don’t allow climate to become a mainly left-of-center concern. One reason the topic of climate has become so anathema to many conservatives is that so many of the proposed solutions have the flavor, and often the price tag, of old-fashioned statism
  • 4) Be honest about the nature of the challenge. Talk of an imminent climate catastrophe is probably misleading, at least in the way most people understand “imminent.”
  • A more accurate description of the challenge might be a “potentially imminent tipping point,” meaning the worst consequences of climate change can still be far off but our ability to reverse them is drawing near. Again, the metaphor of cancer — never safe to ignore and always better to deal with at Stage 2 than at Stage 4 — can be helpful.
  • 5) Be humble about the nature of the solutions. The larger the political and financial investment in a “big fix” response to climate change on the scale of the Energiewende, the greater the loss in time, capital and (crucially) public trust when it doesn’t work as planned
  • 6) Begin solving problems our great-grandchildren will face. Start with sea-level rise
  • We can also stop providing incentives for building in flood-prone areas by raising the price of federal flood insurance to reflect the increased risk more accurately.
  • 7) Stop viewing economic growth as a problem. Industrialization may be the leading cause of climate change. But we cannot and will not reverse it through some form of deindustrialization, which would send the world into poverty and deprivation
  • 8) Get serious about the environmental trade-offs that come with clean energy. You cannot support wind farms but hinder the transmission lines needed to bring their power to the markets where they are needed.
  • 9) A problem for the future is, by its very nature, a moral one. A conservative movement that claims to care about what we owe the future has the twin responsibility of setting an example for its children and at the same time preparing for that future.
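The ice-loss figure quoted in the annotations above (an annualized average of 170 gigatons per year, equated with about 5,400 tons per second) can be sanity-checked with a quick unit conversion. This is an illustrative check of the article's arithmetic, not code from the source:

```python
# Sanity check: convert Greenland's quoted average ice loss of
# 170 gigatons per year into tons per second.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds

gigatons_per_year = 170
tons_per_second = gigatons_per_year * 1e9 / SECONDS_PER_YEAR

# Comes out to roughly 5,400 tons per second, matching the figure
# cited in the annotation.
print(round(tons_per_second))
```

The result lands within rounding distance of the "about 5,400 tons per second" cited in the excerpt, so the two numbers are consistent.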
Javier E

These Truths: A History of the United States (Jill Lepore) - 1 views

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • ...297 more annotations...
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • In 1751, he wrote an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, other critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
    [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • This instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations.
  • In 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • State legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,         The Australian ballot works like a charm         It makes them think and scratch         And when a Negro gets a ballot         He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • Its leaders argued that fighting the inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Woodrow Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

A Catholic Tribute to Lord Sacks | Sohrab Ahmari | First Things - 0 views

  • The West, according to an account beloved by Catholics, rose out of a providential encounter between reason and revelation in antiquity. Though occasioned by conquest, the encounter yielded an authentic synthesis: between a Greek rationality in search of the deepest origin of reality and a Jewish God professed to be just that, the very ground of being (cf. Ex 3:14). Later, that same God identified himself even more starkly and intimately with reason (cf. Jn 1:1).
  • Tragically, the story goes on, this synthesis eventually lost its supremacy in the West, owing foremost to opponents inside the Church determined to distill a “purer” faith, unmottled by “worldly” philosophy. The result was a stingy account of reason that excluded things divine and paved the way for a narrowly scientistic rationality
  • Today, we are the victims of this dis-integration, a process of Christian de-Hellenization centuries in the making.
  • The late Rabbi Lord Jonathan Sacks, who died last month, utterly rejected this account of faith and reason. 
  • The God of the Hebrew Bible, he believed, was never the God of the Academy to begin with. The God of Abraham, Isaac, and Jacob is neither the unmoved mover nor the ground of being, but a historical God, who has put himself in dialogue and relationship with one people, the Jews.
  • little about him could be deduced by processes of reason. He is best known, rather, through the moral revolution heralded by Abrahamic faith: Judaism first, followed by Christianity and Islam.
  • De-Hellenization was thus no skin off the back of biblical faith, rightly understood. For, in this telling, the faith of the Jews, including Jesus, had always sat uneasily with the “faith” of Plato and Aristotle.
  • The synthesis between the two collapsed once its Greek metaphysical structure gave way to the battering ram of modern science.
  • The God of the Bible, Sacks contended, was lost in the bargain of Saint Paul’s ambition to spread his newfound faith to the Greco-Roman sphere. More to the point, God was lost in translation. The Greek language, with its left-to-right script, per Sacks, tends toward abstraction and universalization, whereas Hebrew is fundamentally a “right-brained” language, tending toward narrative and particularity.
  • The result was that the West received an abstract, theoretical version of a supremely narrativistic deity.
  • The Hebrew Bible, Sacks believed, has no “theory” of being itself, of natural law or of political regimes.
  • Sacks was, in truth, a pure anti-metaphysicist. In his 2011 book, The Great Partnership: Science, Religion and the Search for Meaning, he declared: “We cannot prove that life is meaningful and that God exists.”
  • he was thrilled by his atheist teachers’ demolition of the classical proofs for God, which he’d always considered a kind of cheap sleight of hand.
  • “Neither can we prove that love is better than hate, altruism than selfishness, forgiveness than the desire for revenge.” All of these statements are a matter of “interpretation,” rather than of “explanation,” and all interpretations are beyond proof or falsification.
  • The quest for ultimate meaning, he argued, falls into the same territory as “ethics, aesthetics and metaphysics”—and “in none of these three disciplines can anything of consequence be proved.”
  • Ethics, aesthetics, and metaphysics are great “repositories of human wisdom,” to be sure, but they simply don’t belong in “the same universe of discourse” as science.
  • If we distinguish the two discourses, neither need threaten the other: The one (science) explains the world by “taking things apart,” as Sacks put it; the other (religion) puts them back together via interpretation and moral formation.
  • For many Catholic intellectuals, not least Benedict XVI, restoring religion to its rightful place in human affairs involves undoing the philosophical mistakes of nominalism and of the Reformation, which the pope emeritus singled out for criticism in his much-misunderstood 2006 Regensburg Lecture.
  • We must dilate reason’s scope, Benedict thought, so that “reasoning” might again include more than merely observing phenomena and identifying their efficient material causes. Sacks did not think faith and reason could be reunited in this way.
  • But shouldn't we try? I seek ultimate meaning, yes, but I want that meaning to be true in a way that satisfies reason’s demands. And there lies the disagreement, I think, between “Regensburg Catholics,” if you will, and the various de-Hellenizing strands of contemporary religious thought.
  • despite rejecting almost in toto the Church’s account of faith and reason, Sacks nevertheless credited it for the fundamental humaneness of Western civilization.
  • More than that, the rabbi blamed the mass horrors of modernity on the narrow and arrogant rationalism that supplanted the old synthesis.
  • “Outside religion,” he wrote, there is no secure base for the unconditional source of worth that in the West has come from the idea that we are each in God’s image.
  • Though many have tried to create a secular substitute, none has ultimately succeeded. None has stood firm under pressure. That has been demonstrated four times in the modern world, when an attempt was made to create a social order on secular lines: the French Revolution, Stalinist Russia, Nazi Germany and Communist China. When there is a bonfire of sanctities, lives are lost.
  • As a student of Jewish history, Sacks knew well that the old synthesis of faith and reason wasn’t always a guarantee against unreason when it came to the treatment of Jews within Christendom. Nevertheless, he was far more wary of the merciless abstractions of the post-Enlightenment era
  • Sacks, to be clear, was no counter-Enlightenment thinker. And he paid gracious tribute to the modern scientific enterprise as an almost-miraculous instance of human cooperation with divine creativity.
  • Nevertheless, he insisted, the Enlightenment ideology, with its tendency to apply the methods of scientific inquiry to all of life, “dehumanize[d] human beings.” Its universalist “reason” detested particularity, not least the stubborn particularity of the Jewish people
  • Moreover, it targeted for demolition, in the name of humanity and reason, “the local, the church, the neighborhood, the community, even the family, the things that make us different, attached.”
  • Sacks saw similar dangers at work in today’s market liberalism: “a loss of belief in the dignity and sanctity of life”; “the loss of the politics of covenant, the idea that society is a place where we undertake collective responsibility for the common good”; “a loss of morality”; “the loss of marriage”; and the loss of “the possibility of a meaningful life.” In short, the technocratic dystopia we are stumbling into.
  • Except, Sacks rightly insisted, we don’t have to, provided we can make room in our lives and societies for “the still-small voice that the Bible tells us is the voice of God”:
  • Sacks felt that divine voice couldn’t be definitively reasoned about, certainly not in the way that, say, Benedict XVI called for. Yet the rabbi’s own public presence—supremely learned yet humble and unfailingly charitable, even to his most vicious secularist opponents—was and will remain an enduring testament to the reasonableness of faith. 
Javier E

What Is Wrong with the West's Economies? by Edmund S. Phelps | The New York Review of B... - 0 views

  • What is wrong with the economies of the West—and with economics?
  • With little or no effective policy initiative giving a lift to the less advantaged, the jarring market forces of the past four decades—mainly the slowdowns in productivity that have spread over the West and, of course, globalization, which has moved much low-wage manufacturing to Asia—have proceeded, unopposed, to drag down both employment and wage rates at the low end. The setback has cost the less advantaged not only a loss of income but also a loss of what economists call inclusion—access to jobs offering work and pay that provide self-respect.
  • The classical idea of political economy has been to let wage rates sink to whatever level the market takes them, and then provide everyone with the “safety net” of a “negative income tax,” unemployment insurance, and free food, shelter, clothing, and medical care
  • This failing in the West’s economies is also a failing of economics
  • many people have long felt the desire to do something with their lives besides consuming goods and having leisure. They desire to participate in a community in which they can interact and develop.
  • Our prevailing political economy is blind to the very concept of inclusion; it does not map out any remedy for the deficiency
  • injustice of another sort. Workers in decent jobs view the economy as unjust if they or their children have virtually no chance of climbing to a higher rung in the socioeconomic ladder
  • though the injustices in the West’s economies are egregious, they ought not to be seen as a major cause of the productivity slowdowns and globalization. (For one thing, a slowdown of productivity started in the US in the mid-1960s and the sharp loss of manufacturing jobs to poorer countries occurred much later—from the late 1970s to the early 1990s.) Deeper causes must be at work.
  • justice is not everything that people need from their economy. They need an economy that is good as well as just. And for some decades, the Western economies have fallen short of any conception of a “good economy”—an economy offering a “good life,” or a life of “richness,” as some humanists call it
  • The good life as it is popularly conceived typically involves acquiring mastery in one’s work, thus gaining for oneself better terms—or means to rewards, whether material, like wealth, or nonmaterial—an experience we may call “prospering.”
  • As humanists and philosophers have conceived it, the good life involves using one’s imagination, exercising one’s creativity, taking fascinating journeys into the unknown, and acting on the world—an experience I call “flourishing.”
  • “Money is like blood. You need it to live but it isn’t the point of life.”4
  • prospering and flourishing became prevalent in the nineteenth century when, in Europe and America, economies emerged with the dynamism to generate their own innovation.
  • today’s standard economics. This economics, despite its sophistication in some respects, makes no room for economies in which people are imagining new products and using their creativity to build them. What is most fundamentally “wrong with economics” is that it takes such an economy to be the norm—to be “as good as it gets.”
  • In nineteenth-century Britain and America, and later Germany and France, a culture of exploration, experimentation, and ultimately innovation grew out of the individualism of the Renaissance, the vitalism of the Baroque era, and the expressionism of the Romantic period.
  • What made innovating so powerful in these economies was that it was not limited to elites. It permeated society from the less advantaged parts of the population on up.
  • High-enough wages, low-enough unemployment, and wide-enough access to engaging work are necessary for a “good-enough” economy—though far from sufficient. The material possibilities of the economy must be adequate for the nonmaterial possibilities to be widespread—the satisfactions of prospering and of flourishing through adventurous, creative, and even imaginative work.
  • prospering
  • Since around 1970, or earlier in some cases, most of the continental Western European economies have come to resemble more completely the mechanical model of standard economics. Most companies are highly efficient. Households, apart from the very low-paid or unemployed, have gone on saving.
  • In most of Western Europe, economic dynamism is now at lows not seen, I would judge, since the advent of dynamism in the nineteenth century. Imagining and creating new products has almost disappeared from the continent
  • The bleak levels of both unemployment and job satisfaction in Europe are testimony to its dreary economies.
  • a recent survey of household attitudes found that, in “happiness,” the median scores in Spain (54), France (51), Italy (48), and Greece (37) are all below those in the upper half of the nations labeled “emerging”—Mexico (79), Venezuela (74), Brazil (73), Argentina (66), Vietnam (64), Colombia (64), China (59), Indonesia (58), Chile (58), and Malaysia (56)
  • The US economy is not much better. Two economists, Stanley Fischer and Assar Lindbeck, wrote of a “Great Productivity Slowdown,” which they saw as beginning in the late 1960s.11 The slowdown in the growth of capital and labor combined—what is called “total factor productivity”—is stark.
  • What is the mechanism of the slowdown in productivity?
  • The plausible explanation of the syndrome in America—the productivity slowdown and the decline of job satisfaction, among other things—is a critical loss of indigenous innovation in the established industries like traditional manufacturing and services that was not nearly offset by the innovation that flowered in a few new industries
  • What then caused this narrowing of innovation? No single explanation is persuasive. Yet two classes of explanations have the ring of truth. One points to suppression of innovation by vested interests.
  • some professions, such as those in education and medicine, have instituted regulation and licensing to curb experimentation and change, thus dampening innovation
  • established corporations—their owners and stakeholders—and entire industries, using their lobbyists, have obtained regulations and patents that make it harder for new firms to gain entry into the market and to compete with incumbents.
  • The second explanation points to a new repression of potential innovators by families and schools. As the corporatist values of control, solidarity, and protection are invoked to prohibit innovation, traditional values of conservatism and materialism are often invoked to inhibit a young person from undertaking an innovation.
  • How might Western nations gain—or regain—widespread prospering and flourishing? Taking concrete actions will not help much without fresh thinking: people must first grasp that standard economics is not a guide to flourishing—it is a tool only for efficiency.
  • Widespread flourishing in a nation requires an economy energized by its own homegrown innovation from the grassroots on up. For such innovation a nation must possess the dynamism to imagine and create the new—economic freedoms are not sufficient. And dynamism needs to be nourished with strong human values.
  • a reform of education stands out. The problem here is not a perceived mismatch between skills taught and skills in demand
  • The problem is that young people are not taught to see the economy as a place where participants may imagine new things, where entrepreneurs may want to build them and investors may venture to back some of them. It is essential to educate young people to this image of the economy.
  • It will also be essential that high schools and colleges expose students to the human values expressed in the masterpieces of Western literature, so that young people will want to seek economies offering imaginative and creative careers. Education systems must put students in touch with the humanities in order to fuel the human desire to conceive the new and perchance to achieve innovations
  • This reorientation of general education will have to be supported by a similar reorientation of economic education.
Javier E

The Japanese Catastrophe-Posner - The Becker-Posner Blog - 0 views

  • A catastrophic risk, in the policy-relevant sense, is a very low (or unknown but believed to be low) probability of a very large loss. If the probability of loss is high, strenuous efforts will be made to avert it or mitigate its consequences. But if the probability is believed to be very low, the proper course to take will be difficult, both as a matter of sound policy and as a political matter (to which I return in the last paragraph of this comment), to determine and implement. The relevant cost is the catastrophic loss if it occurs discounted (multiplied) by the probability of its occurring. If that probability is believed to be very low, the expected cost may be reckoned to be low even if, should the loss occur, it would be catastrophic. And if the expected cost is low but the cost of prevention is high, then doing nothing to prevent the risk from materializing may be the optimal course of (in)action.
  • Politicians have limited time horizons. If the annual probability of some catastrophe is 1 percent, and a politician’s horizon is 5 years, he will be reluctant to support significant expenditures to reduce the likelihood or magnitude of the catastrophe, because to do so would involve supporting either higher taxes or a reallocation of government expenditures from services that provide immediate benefits to constituents.
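The expected-cost logic in Posner's comment can be sketched in a few lines. This is only an illustration of the arithmetic he describes; the dollar figures and probabilities below are hypothetical, not from the source (only the 1 percent annual probability and five-year horizon echo his politician example).

```python
# Illustrative sketch of Posner's expected-cost reasoning about
# catastrophic risk. All numbers are made up for illustration.

def expected_cost(prob: float, loss: float) -> float:
    """Expected cost of a catastrophe: the loss discounted
    (multiplied) by the probability of its occurring."""
    return prob * loss

def prob_within_horizon(annual_prob: float, years: int) -> float:
    """Chance the catastrophe occurs at least once within a
    politician's time horizon of the given number of years."""
    return 1 - (1 - annual_prob) ** years

# A 1-in-10,000 annual chance of a $1 trillion loss carries an
# expected annual cost of only $100 million -- "low" even though
# the loss itself would be catastrophic.
print(expected_cost(1e-4, 1e12))                  # 100000000.0

# Posner's politician: a 1%-per-year risk over a 5-year horizon
# means roughly a 4.9% chance the catastrophe lands on his watch.
print(round(prob_within_horizon(0.01, 5), 4))     # 0.049
```

The second number shows why the incentive problem bites: even a non-trivial annual risk looks small over one electoral cycle, while the costs of prevention are paid up front.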
Javier E

What Can Be Done About the Inefficiencies of Giving Gifts? - The Atlantic - 0 views

  • Deadweight loss is the mismatch between what a gift giver thinks a receiver wants and what the receiver actually wants. This, in Waldfogel's words, "is just the waste that arises from people making choices for other people. Normally I’ll only buy myself something that costs $50 if it’s worth at least $50 to me. When I go out and spend $50 on you though, because I don’t know what you like and what you need, I could spend $50 and buy something that would be worth nothing to you."
  • Expanding this concept to the whole economy, a conservative estimate of deadweight loss is 10 percent (that is, the average gift receiver values a gift at 90 percent of its actual value). Given that Americans are expected to spend about $600 billion on holiday gifts this year, that would put the amount of deadweight loss at $60 billion.
  • Did you get a gift this year you didn't like? Go ahead and return it or exchange it for something you do like—you'll be reducing deadweight loss.
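The $60 billion figure above is just the spending total times the assumed 10 percent loss rate; a one-line check (using the article's own numbers):

```python
# Figures from the article: ~$600 billion in holiday gift spending,
# receivers valuing gifts at ~90% of what they cost.
total_spending = 600e9
valuation_ratio = 0.90

deadweight_loss = total_spending * (1 - valuation_ratio)  # ≈ $60 billion
print(deadweight_loss)
```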
malonema1

Trump's tariff plan puts jobs at risk - 0 views

  • President Donald Trump fulfilled a long-running campaign promise Thursday in levying tariffs on imported steel and aluminum products after a week of build-up to the announcement. The reaction to the move was divided, with major steel and aluminum players expressing support for the move, some promising job creation, while critics said the tariffs would lead to job losses in other industries.
  • Employment in the steel industry has been declining for two decades, down some 35 percent from 216,400 workers in 1998 to 139,800 in 2016. Between 2015 and 2016 alone more than 14,000 jobs were lost due to plant closings, bankruptcies and more, according to a January 2018 report from the Commerce Department. U.S. Steel and Century Aluminum have both committed to reinvesting in closed plants and hiring workers, 800 in total, as a result of the tariffs, while other major players, including Alcoa and ArcelorMittal, have expressed their support for Trump's order.
  • But economists and researchers say that despite gains in steel jobs, losses will be seen in other sectors that will more than cancel out any new job creation. A report from global research firm Trade Partnership Worldwide finds that while some 33,000 jobs will be added within steel and aluminum due to the tariffs, the broader U.S. economy, including manufacturing and energy, will see losses of nearly 180,000 jobs, for a net loss of nearly 146,000 jobs. For every one job created as a result of the tariffs, five will be lost, the report finds. The study does not take into account any retaliation against the tariffs, only the tariffs themselves.
  • ...1 more annotation...
  • He said the jobs added in the metals industry will likely not be permanent. "As you damage the consumers of metal in the United States — the people in the supply chain who purchased the steel and aluminum — in time they would purchase less metals which would lead then to these new created jobs being temporary," Hardy said.
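The report's headline numbers hang together arithmetically; in the sketch below, the gross-loss figure is backed out from the article's rounded totals rather than taken from the report itself:

```python
# Figures from the Trade Partnership Worldwide report, as quoted above.
jobs_created = 33_000   # steel and aluminum jobs added by the tariffs
net_loss = 146_000      # "net loss of nearly 146,000 jobs"

# Gross losses elsewhere in the economy ("nearly 180,000").
jobs_lost_elsewhere = jobs_created + net_loss  # 179,000

# Roughly five jobs lost for every one created, as the report finds.
ratio = jobs_lost_elsewhere / jobs_created
print(jobs_lost_elsewhere, round(ratio, 1))
```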
Javier E

Opinion | 'Medicare for All' Could Kill Two Million Jobs, and That's O.K. - The New Yor... - 0 views

  • Any significant reform would require major realignment of the health care sector, which is now the biggest employer in at least a dozen states. Most hospitals and specialists would probably lose money. Some, like the middlemen who negotiate drug prices, could be eliminated. That would mean job losses in the millions.
  • the point is to streamline for patients a Kafka-esque health care system that makes money for industry through irrational practices. After all, shouldn’t the primary goal of a health care system be delivering efficient care at a reasonable price
  • In 2012, the Harvard economists Katherine Baicker and Amitabh Chandra warned against “treating the health care system like a (wildly inefficient) jobs program.” They were rightly worried that the health care system was the primary engine of recovery from the Great Recession
  • ...6 more annotations...
  • Change could come in many guises: for example, some form of Medicare expansion, government negotiations on drug prices or enhancing the power of the Affordable Care Act. The more fundamental the reform, the more severe the economic effect.
  • The first casualties of a Medicare for all plan, said Kevin Schulman, a physician-economist at Stanford, would be the “intermediaries that add to cost, not quality.” For example, the armies of administrators, coders, billers and claims negotiators who make good middle-class salaries and have often spent years in school learning these skills.
  • Stanford researchers estimate that 5,000 community hospitals would lose more than $151 billion under a Medicare for all plan; that would translate into the loss of 860,000 to 1.5 million jobs. A Navigant study found that a typical midsize, nonprofit hospital system would have a net revenue loss of 22 percent.
  • Medicare for all would result in job losses (mostly among administrators) “somewhere in the range of two million” — about half on the insurers’ side and half employed in hospitals and doctors’ offices to argue with the former.
  • “What we can’t quantify is the effect that high health care costs have had on non-health care industries.”
  • The expense of paying for employees’ health care has depressed wages and entrepreneurship, he said. He described a textile manufacturer that moved more than 1,000 jobs out of the country because it couldn’t afford to pay for insurance for its workers. Such decisions have become common in recent years.
Javier E

How Will the Coronavirus Change Us? - The Atlantic - 0 views

  • Although medical data from the time are too scant to be definitive, its first attack is generally said to have occurred in Kansas in March 1918, as the U.S. was stepping up its involvement in the First World War.
  • Estimates of the final death toll range from 17 million to 100 million, depending on assumptions about the number of uncounted victims. Almost 700,000 people are thought to have died in the United States—as a proportion of the population, equivalent to more than 2 million people today.
  • Garthwaite matched NHIS respondents’ health conditions to the dates when their mothers were probably exposed to the flu. Mothers who got sick in the first months of pregnancy, he discovered, had babies who, 60 or 70 years later, were unusually likely to have diabetes; mothers afflicted at the end of pregnancy tended to bear children prone to kidney disease. The middle months were associated with heart disease.
  • ...23 more annotations...
  • Other studies showed different consequences. Children born during the pandemic grew into shorter, poorer, less educated adults with higher rates of physical disability than one would expect
  • the microorganisms likely killed more people than the war did. And their effects weren’t confined to European battlefields, but spread across the globe, emptying city streets and filling cemeteries on six continents.
  • Unlike the war, the flu was incomprehensible—the influenza virus wasn’t even identified until 1931. It inspired fear of immigrants and foreigners, and anger toward the politicians who played down the virus
  • killed more men than women, skewing sex ratios for years afterward. Can one be sure that the ensuing, abrupt changes in gender roles had nothing to do with the virus?
  • the accompanying flood of anti-Semitic violence. As it spread through Germany, Switzerland, France, Spain, and the Low Countries, it left behind a trail of beaten cadavers and burned homes.
  • In northern Italy, landlords tended to raise wages, which fostered the development of a middle class. In southern Italy, the nobility enacted decrees to prevent peasants from leaving to take better offers. Some historians date the separation in fortunes of the two halves of Italy—the rich north, the poor south—to these decisions.
  • When the Black Death began, the English Plantagenets were in the middle of a long, brutal campaign to conquer France. The population losses meant such a rise in the cost of infantrymen that the whole enterprise foundered. English nobles did not occupy French châteaus. Instead they stayed home and tried to force their farmhands to accept lower wages. The result, the Peasants’ Revolt of 1381, nearly toppled the English crown. King Richard II narrowly won out, but the monarchy’s ability to impose taxes, and thus its will, was permanently weakened.
  • The coronavirus is hitting societies that regarded deadly epidemics as things of the past, like whalebone corsets and bowler hats.
  • The American public has not enjoyed its surprise reentry into the world of contagion and quarantine—and this unhappiness seems likely to have consequences.
  • People sought new sources of authority, finding them through direct personal experience with the world and with God.
  • With the supply of European workers suddenly reduced and the demand for labor relatively unchanged, medieval landowners found themselves in a pickle: They could leave their grain to rot in the fields, or they could abandon all sense of right and wrong and raise wages enough to attract scarce workers
  • Within a few decades, Cohn wrote, hysteria gave way to sober observation. Medical tracts stopped referring to conjunctions of Saturn and prescribed more earthly cures: ointments, herbs, methods for lancing boils. Even priestly writings focused on the empirical. “God was not mentioned,” Cohn noted. The massacres of Jews mostly stopped.
  • the lesson seems more that humans confronting unexpected disaster engage in a contest for explanation—and the outcome can have consequences that ripple for decades or centuries.
  • Columbus’s journey to the Americas set off the worst demographic catastrophe in history
  • Somewhere between two-thirds and nine-tenths of the people in the Americas died. Many later European settlers, like my umpteen-great-grandparents, believed they were coming to a vacant wilderness. But the land was not empty; it had been emptied—a world of loss encompassed in a shift of tense.
  • Absent the diseases, it is difficult to imagine how small groups of poorly equipped Europeans at the end of very long supply chains could have survived and even thrived in the alien ecosystems of the Americas
  • “I fully support banning travel from Europe to prevent the spread of infectious disease,” the Cherokee journalist Rebecca Nagle remarked after President Trump announced his plan to do this. “I just think it’s 528 years too late.”
  • a possible legacy of Hong Kong’s success with SARS is that its citizens seem to put more faith in collective action than they used to
  • The result will be, among other things, a test of how much contemporary U.S. society values the elderly.
  • The speed with which pundits emerged to propose that the U.S. could more easily tolerate a raft of dead oldsters than an economic contraction indicates that the reservoir of appreciation for today’s elders is not as deep as it once was
  • the 2003 SARS epidemic in Hong Kong. That epidemic, which killed about 300 people, was stopped only by heroic communal efforts. (As a percentage of the population, the equivalent U.S. death toll would be about 15,000.)
  • For Native peoples, the U-shaped curve was as devastating as the sheer loss of life. As an indigenous archaeologist once put it to me, the epidemics simultaneously robbed his nation of its future and its past: the former, by killing all the children; the latter, by killing all the elders, who were its storehouses of wisdom and experience.
  • Past societies mourned the loss of collective memory caused by epidemics. Ours may not, at least at first.
aidenborst

A Year Later, Who Is Back to Work and Who Is Not? - The New York Times - 0 views

  • The economy has greatly improved from the worst months of job loss last spring, but millions of people are still out of work. And neither the initial losses nor the subsequent gains have been spread evenly.
  • As a proportion of their employment levels before the pandemic, significantly fewer Black and Hispanic women are working now than any other demographic, according to the latest government data — and women are lagging behind men across race and ethnicity.
  • Hispanic women fell into the deepest hole at the peak of the job losses, going from 12.4 million workers in February 2020, the last month of job gains before the pandemic, to 9.4 million in April — a 24 percent drop.
  • ...5 more annotations...
  • Research has shown that some of the disproportionate impact on women was driven by the need to care for children during the pandemic, a circumstance that is often not captured in the official unemployment rate, which accounts only for people actively seeking work.
  • Even among women, however, white women have not experienced the same changes in employment levels as women of color.
  • Comparing the percentage change in employment totals from a year ago is a useful benchmark for how hard the pandemic hit the American work force. But to see how the recovery is worsening inequality in the economy, it’s important to look at where different groups started from.
  • Workers on the older and younger ends of the spectrum also experienced outsize losses. Younger people, who also tend to be overrepresented in some of the most affected industries like food service, were much more likely to lose work early in the outbreak and are still among the farthest from their prepandemic employment levels. However, they have regained jobs more rapidly than older people, who may be more wary of returning to work and increasing their exposure to the coronavirus.
  • According to an analysis from the Economic Policy Institute, a left-leaning research group, workers in the lowest quartile of earners lost almost eight million jobs from 2019 to 2020, while the highest wage earners gained jobs.
anonymous

The FDA Has Approved An Obesity Drug That Helped Some People Drop Weight By 15% : NPR - 0 views

  • Regulators on Friday said a new version of a popular diabetes medicine could be sold as a weight-loss drug in the U.S.
  • In company-funded studies, participants taking Wegovy had average weight loss of 15%, about 34 pounds (15.3 kilograms).
  • Dropping even 5% of one's weight can bring health benefits, such as improved energy, blood pressure, blood sugar and cholesterol levels, but that amount often doesn't satisfy patients who are focused on weight loss, Bays said.
  • ...4 more annotations...
  • Bays said Wegovy appears far safer than earlier obesity drugs that "have gone down in flames" over safety problems.
  • The drug carries a potential risk for a type of thyroid tumor, so it shouldn't be taken by people with a personal or family history of certain thyroid and endocrine tumors. Wegovy also has a risk of depression and pancreas inflammation.
  • Wegovy (pronounced wee-GOH'-vee) is a synthesized version of a gut hormone that curbs appetite. Patients inject it weekly under their skin. Like other weight-loss drugs, it's to be used along with exercise, a healthy diet and other steps like keeping a food diary.
  • Wegovy builds on a trend in which makers of relatively new diabetes drugs test them to treat other conditions common in diabetics. For example, popular diabetes drugs Jardiance and Novo Nordisk's Victoza now have approvals for reducing risk of heart attack, stroke and death in heart patients.
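The trial numbers quoted above (15% average loss, about 34 pounds) imply an average starting weight, and the pound-to-kilogram conversion is a standard constant; everything else below comes from the article:

```python
# Figures from the company-funded studies, as quoted above.
avg_loss_lb = 34.0
avg_loss_fraction = 0.15

# Implied average starting weight of trial participants.
implied_start_lb = avg_loss_lb / avg_loss_fraction  # ≈ 227 lb

# Metric conversion (the article rounds to 15.3 kg).
kg_per_lb = 0.45359237
avg_loss_kg = avg_loss_lb * kg_per_lb  # ≈ 15.4 kg
print(round(implied_start_lb), round(avg_loss_kg, 1))
```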
Javier E

Waking Up at 4 A.M. Every Day Is the Key to Success. Or to Getting a Cold. - The New Yo... - 0 views

  • Those who slept less than six hours a night “produced cognitive performance deficits equivalent to up to 2 nights of total sleep deprivation.”
  • a group who slept only four hours a night — a common amount for those who wake up very early — for six days in a row. That group quickly developed higher levels of the stress hormone cortisol, higher blood pressure and produced half the usual amount of antibodies to a flu vaccine.
  • regularly getting four hours of sleep is the equivalent of the mental impairment of being up for 24 hours.
  • ...15 more annotations...
  • missing just one night of sleep impairs memory.
  • an impaired mind focuses “on negative information when making decisions.”
  • losing out on as little as 16 minutes a night could have serious negative impacts on job performance.
  • When we delay or speed up our internal body clock, it can have the same consequences as not getting enough sleep, a phenomenon known as advanced sleep-wake phase disorder.
  • “The reason is that our circadian rhythm tells our brain when to produce melatonin, our sleep hormone, so if you try to wake while your brain is still producing melatonin, you could feel excessive daytime sleepiness, low energy, decline in mood and cognitive impact,”
  • “There are a handful of people who can function adequately on a shorter sleep duration than the average person, but it’s very, very rare,
  • Missing even two hours here, an hour there, then having a wildly different sleep pattern over the weekend, is the gateway drug to chronic sleep deprivation.
  • you may be able to adjust your schedule
  • If you get less than seven hours a night, you can put on weight, since sleep loss can adversely impact energy intake and expenditure. That’s because, in part, the chemical that makes you feel full, leptin, is reduced, while ghrelin, the hunger hormone, increases
  • chronic sleep loss can increase the amount of free fatty acids in the blood.
  • People with sleep issues may also be at higher risk for depression and anxiety, while those disorders can also interfere with sleep.
  • The National Sleep Foundation recommends sticking to a sleep schedule. It won’t happen right away, and you’ll have to build and buy back your debt
  • Set a goal and regular bedtime, and turn your bedroom into a comfortable, dark, sleep-friendly area. That could mean blackout curtains, maybe a sleeping mask or earplugs.
  • Let your body wake you up, a key to regaining natural circadian rhythm
  • Reading before bed, something Bill Gates and Arianna Huffington swear by, relaxes the mind
carolinehayter

The covid recession economically demolished minority and low income workers and barely ... - 0 views

  • The economic collapse sparked by the pandemic is triggering the most unequal recession in modern U.S. history, delivering a mild setback for those at or near the top and a depression-like blow for those at the bottom, according to a Washington Post analysis of job losses across the income spectrum.
  • While the nation overall has regained nearly half of the lost jobs, several key demographic groups have recovered more slowly, including mothers of school-age children, Black men, Black women, Hispanic men, Asian Americans, younger Americans (ages 25 to 34) and people without college degrees.
  • White women, for example, have recovered 61 percent of the jobs they lost — the most of any demographic group — while Black women have recovered only 34 percent, according to Labor Department data through August.
  • ...27 more annotations...
  • The recession’s inequality is a reflection of the coronavirus itself, which has caused more deaths in low-income communities and severely affected jobs in restaurants, hotels and entertainment venues
  • No other recession in modern history has so pummeled society’s most vulnerable. The Great Recession of 2008 and 2009 caused similar job losses across the income spectrum, as Wall Street bankers and other white-collar workers were handed pink slips alongside factory and restaurant workers.
  • The unemployed are facing new challenges. Despite President Trump’s promises of a short-lived recession, 26 million people are still receiving now-diminished unemployment benefits. The unemployed went from receiving, on average, over $900 a week in April, May, June and July, under the first federal stimulus package, to about $600 for a few weeks in late August and early September under a temporary White House executive action, to about $300 a week now on state benefits.
  • At the height of the coronavirus crisis, low-wage jobs were lost at about eight times the rate of high-wage ones, The Post found.
  • The less workers earned at their job, the more likely they were to lose it as businesses across the country closed.
  • By the end of the summer, the downturn was largely over for the wealthy — white-collar jobs had mostly rebounded, along with home values and stock prices. The shift to remote work strongly favored more-educated workers, with as many as 6 in 10 college-educated employees working from home at the outset of the crisis, compared with about 1 in 7 who have only high school diplomas.
  • Americans ages 20 to 24 suffered the greatest job losses, by far, of any age group when many businesses closed in the spring. College-age workers and recent graduates tend to be overrepresented in low-paying retail and restaurant jobs, which allow them to gain a toehold in the workforce and save money for school or training.
  • In the wake of widespread closings of schools and day-care centers, mothers are struggling to return to the workforce. Mothers of children ages 6 to 17 saw employment fall by about a third more than fathers of children the same age, and mothers are returning to work at a much slower rate. This disparity threatens years of progress for women in the labor force.
  • “The sectors most deeply affected by covid disproportionately employ women, minorities and lower-income workers.
  • What ties all of the hardest-hit groups together ― low-wage workers, Black workers, Hispanic men, those without college degrees and mothers with school-age children ― is that they are concentrated in hotels, restaurants and other hospitality jobs.
  • Most recessions, including the Great Recession, have affected manufacturing and construction jobs the most, but not this time. Nine of the 10 hardest-hit industries in the coronavirus recession are services.
  • Economists worry that many of these jobs will not return
  • Women had logged tremendous job gains in the past decade before the coronavirus hit.
  • over 30,000 restaurant and hospitality workers are unemployed in New Orleans, making it nearly impossible to find a job.
  • Ten percent of renters reported “no confidence” in their ability to pay next month’s rent, according to a U.S. Census Bureau survey conducted Sept. 2 to 14.
  • Black women are facing the largest barriers to returning to work, data shows, and have recovered only 34 percent of jobs lost in the early months of the pandemic.
  • It took until 2018 for Black women’s employment to recover from the Great Recession. Now almost all of those hard-won gains have been erased.
  • Historically, people of color and Americans with less education have been overrepresented in low-paying service jobs. Economists call it “occupational segregation.”
  • Black and Hispanic men face many of the same challenges as Black women, encountering discrimination in the workforce more often than others, and they struggled to rebound from the Great Recession.
  • While the U.S. unemployment rate has fallen to 8.4 percent, double-digit unemployment lingers in cities and states that depend heavily on tourism.
  • But with many schools and child-care centers closed and the migration to online learning, many working parents have had to become part- or full-time teachers, making it difficult to work at the same time. That burden has fallen mainly on mothers, data shows. For example, mothers of children ages 6 to 12 — the elementary school years — have recovered fewer than 45 percent of jobs lost, while employment of fathers of children the same age is 70 percent back.
  • Single parents have faced an especially hard blow.
  • One in eight households with children do not have enough to eat, according to the September survey by the Census Bureau.
  • The Fed predicts unemployment will not near pre-pandemic levels until the end of 2023. For many jobs, it may take even longer — especially those already at high risk of being replaced with software and robots.
  • “Since the 1980s, almost all employment losses in routine occupations, which are relatively easier to be automated, occurred during recessions,”
  • Many economists and business leaders are urging Congress to enact another large relief package, given the unevenness of the recovery and the long road for those who have been left behind.
  • “There are very clear winners and losers here. The losers are just being completely crushed. If the winners fail to help bring the losers along, everyone will lose,” said Mark Zandi, chief economist at Moody’s Analytics. “Things feel like they are at a breaking point from a societal perspective.”