The 'Black Hole' That Sucks Up Silicon Valley's Money - The Atlantic - 0 views

  • That’s not to say that Silicon Valley’s wealthy aren’t donating their money to charity. Many, including Mark Zuckerberg, Elon Musk, and Larry Page, have signed the Giving Pledge, committing to dedicating the majority of their wealth to philanthropic causes. But much of that money is not making its way out into the community.
  • The San Francisco Bay Area has rapidly become the richest region in the country—the Census Bureau said last year that median household income was $96,777. It’s a place where $100,000 Teslas are commonplace, “raw water” goes for $37 a jug, and injecting clients with the plasma of youth—a gag on the television show Silicon Valley—is being tried by real companies for just $8,000 a pop.
  • There are many reasons for this, but one of them is likely the increasing popularity of a certain type of charitable account called a donor-advised fund. These funds allow donors to receive big tax breaks for giving money or stock, but have little transparency and no requirement that money put into them is actually spent.
  • Donor-advised funds are categorized by law as public charities, rather than private foundations, so they have no payout requirements and few disclosure requirements.
  • And wealthy residents of Silicon Valley are donating large sums to such funds
  • critics say that in part because of its structure as a warehouse of donor-advised funds, the Silicon Valley Community Foundation has not had a positive impact on the community it is meant to serve. Some people I talked to say the foundation has had little interest in spending money, because its chief executive, Emmett Carson, who was placed on paid administrative leave after the Chronicle’s report, wanted it to be known that he had created one of the biggest foundations in the country. Carson was “aggressive” about trying to raise money, but “unaggressive about suggesting what clients would do with it,”
  • “Most of us in the local area have seen our support from the foundation go down and not up,” he said.
  • The amount of money going from the Silicon Valley Community Foundation to the nine-county Bay Area actually dropped in 2017 by 46 percent, even as the amount of money under management grew by 64 percent, to $13.5 billion
  • “They got so drunk on the idea of growth that they lost track of anything smacking of mission,” he said. It did not help perceptions that the foundation opened offices in New York and San Francisco at the same time local organizations were seeing donations drop.
  • The foundation now gives her organization some grants, but they don’t come from the donor-advised funds, she told me. “I haven’t really cracked the code of how to access those donor-advised funds,” she said. Her organization had been getting between $50,000 and $100,000 a year from United Way that it no longer gets, she said,
  • Rob Reich, the co-director of the Stanford Center on Philanthropy and Civil Society, set up a donor-advised fund at the Silicon Valley Community Foundation as an experiment. He spent $5,000—the minimum amount accepted—and waited. He received almost no communication from the foundation, he told me. No emails or calls about potential nonprofits to give to, no information about whether the staff was out looking for good opportunities in the community, no data about how his money was being managed.
  • One year later, despite a booming stock market, his account was worth less than the $5,000 he had put in, and had not been used in any way in the community. His balance was lower because the foundation charges hefty fees to donors who keep their money there. “I was flabbergasted,” he told me. “I didn’t understand what I, as a donor, was getting for my fees.”
  • Though donors receive a big tax break for donating to donor-advised funds, the funds have no payout requirements, unlike private foundations, which are required to disburse 5 percent of their assets each year. With donor-advised funds, “there’s no urgency and no forced payout,”
  • he had met wealthy individuals who said they were setting up donor-advised funds so that their children could disburse the funds and learn about philanthropy—they had no intent to spend the money in their own lifetimes.
  • Fund managers also receive fees for the amount of money they have under management, which means they have little incentive to encourage people to spend the money in their accounts,
  • Transparency is also an issue. While foundations have to provide detailed information about where they give their money, distributions from donor-advised funds are listed as gifts made from the entire charitable fund—like the Silicon Valley Community Foundation—rather than from individuals.
  • Donor-advised funds can also be set up anonymously, which makes it hard for nonprofits to engage with potential givers. They also don’t have websites or mission statements like private foundations do, which can make it hard for nonprofits to know what causes donors support.
  • Public charities—defined as organizations that receive a significant amount of their revenue from small donations—were saddled with less oversight, in part because Congress figured that their large number of donors would make sure they were spending their money well, Madoff said. But an attorney named Norman Sugarman, who represented the Jewish Community Federation of Cleveland, convinced the IRS to categorize a certain type of asset—charitable dollars placed in individually named accounts managed by a public charity—as donations to public, not private, foundations.
  • Donor-advised funds have been growing nationally as the amount of money made by the top 1 percent has grown: Contributions to donor-advised funds grew 15.1 percent in fiscal year 2016, according to The Chronicle of Philanthropy, while overall charitable contributions grew only 1.4 percent that year
  • Six of the top 10 philanthropies in the country last year, in terms of the amount of nongovernmental money raised, were donor-advised funds,
  • In addition, those funds with high payout rates could just be giving to another donor-advised fund, rather than to a public charity, Madoff says. One-quarter of donor-advised fund sponsors distribute less than 1 percent of their assets in a year,
  • Groups that administer donor-advised funds defend their payout rate, saying distributions from donor-advised funds are around 14 percent of assets a year. But that number can be misleading, because one donor-advised fund could give out all its money, while many more could give out none, skewing the data. [A brief numerical sketch follows this list.]
  • Donor-advised funds are especially popular in places like Silicon Valley because they provide tax advantages for donating appreciated stock, which many start-up founders have but don’t necessarily want to pay huge taxes on
  • Donors get a tax break for the value of the appreciated stock at the time they donate it, which can also spare them hefty capital-gains taxes. “Anybody with a business interest can give their business interest before it goes public and save huge amounts of taxes,”
  • Often, people give to donor-advised funds right before a public event like an initial public offering, so they can avoid the capital-gains taxes they’d otherwise have to pay, and instead receive a tax deduction. Mark Zuckerberg and Priscilla Chan gave $500 million in stock to the foundation in 2012, when Facebook held its initial public offering, and also donated $1 billion in stock in 2013
  • Wealthy donors can also donate real estate and deduct the value of real estate at the time of the donation—if they’d given to a private foundation, they’d only be able to deduct the donor’s basis value (typically the purchase price) of the real estate at the time they acquired it. The difference can be a huge amount of money in the hot market of California.
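
A minimal sketch of the payout-rate point flagged above, using hypothetical numbers rather than anything from the article: an aggregate payout figure can look healthy even when most individual donor-advised accounts distribute nothing.

```python
# Hypothetical illustration (not data from the article): ten donor-advised
# accounts of $1 million each, where one distributes everything and the
# other nine distribute nothing.
assets = [1_000_000] * 10
payouts = [1_000_000] + [0] * 9

aggregate_rate = sum(payouts) / sum(assets)                  # 10% overall
per_fund_rates = sorted(p / a for p, a in zip(payouts, assets))
median_rate = per_fund_rates[len(per_fund_rates) // 2]       # 0% for the typical fund

print(f"aggregate payout rate: {aggregate_rate:.0%}")
print(f"median fund payout:    {median_rate:.0%}")
```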

How Index Funds May Hurt the Economy - The Atlantic

  • Thanks to their ultralow fees and stellar long-term performance, these investment vehicles have soaked up more and more money since being developed by Vanguard’s Jack Bogle in the 1970s
  • as of 2016, investors worldwide were pulling more than $300 billion a year out of actively managed funds and pushing more than $500 billion a year into index funds. Some $11 trillion is now invested in index funds, up from $2 trillion a decade ago. And as of 2019, more money is invested in passive funds than in active funds in the United States.
  • Indexing has also gone small, very small. Although many financial institutions offer index funds to their clients, the Big Three control 80 or 90 percent of the market. The Harvard Law professor John Coates has argued that in the near future, just 12 management professionals—meaning a dozen people, not a dozen management committees or firms, mind you—will likely have “practical power over the majority of U.S. public companies.”
  • Indexing has gone big, very big. For nine in 10 companies on the S&P 500, their largest single shareholder is one of the Big Three. For many, the big indexers control 20 percent or more of their shares. Index funds now control 20 to 30 percent of the American equities market, if not more.
  • The problem is that the public markets have been cornered by a group of investment managers small enough to fit at a lunch counter, dedicated to quiescence and inertia.
  • Passively managed investment options do not just outperform actively managed ones, delivering both better returns and lower fees. They eat their lunch.
  • Let’s imagine that a decade ago you invested $100 in an index fund charging a 0.04 percent fee and $100 in a traditional mutual fund charging a 1.5 percent fee. Let’s also imagine that the index fund tracked the S&P 500, and that the mutual fund ended up returning what the S&P 500 returned. Your passively invested $100 would have turned into $356.66 in 10 years. Your traditionally invested $100 would have turned into $313.37. [A sketch of this arithmetic follows the list.]
  • Actively managed investment options could make up for their higher fees with higher returns. And some do, some of the time. Yet scores of industry and academic studies stretching over decades show that trying to beat the market tends to result in lower returns than just buying the market. Only a quarter of actively managed mutual funds exceeded the returns of their passively managed cousins in the decade leading up to 2019,
  • What might be good for retail investors might not be good for the financial markets, public companies, or the American economy writ large, and the passive revolution’s scope has raised all sorts of hand-wringing and red-flagging. Analysts at Bernstein have called passive investing “worse than Marxism.” The investor Michael Burry, of The Big Short fame, has called it a “bubble,” and a co-head of Goldman Sachs’s investment-management division has warned about froth too. Shortly before his death in 2019, Bogle himself warned that index funds’ dominance might not “serve the national interest.”
  • One primary concern comes from the analysts at Bernstein: “A supposedly capitalist economy where the only investment is passive is worse than either a centrally planned economy or an economy with active, market-led capital management.”
  • Active managers direct investment dollars to companies on the basis of those companies’ research-and-development prospects, human capital, regulatory outlook, and so on. They take new information and price it into a company’s stock when buying and selling shares.
  • Passive investors, by contrast, ignore annual reports and market rumors. They do nothing with trading-floor gossip. They make no attempt to research what to invest in and what to skip. Whether holding international or domestic assets, holding stocks or bonds, or using a mutual-fund structure or an ETF structure, they just mirror the market. Big U.S.-stock index funds buy big U.S. stocks just because they’re big U.S. stocks.
  • At least in a Soviet-type centrally planned economy, apparatchiks would be making some attempt to allocate resources efficiently.
  • Passive management is merely a giant phenomenon, not an all-encompassing one. Hundreds of actively managed mutual funds are still out there, as are legions of day traders, hedge funds, and private offices buying and selling and buying and selling. Stock prices still move around, sometimes dramatically, on the basis of new data and new ideas.
  • Still, passive investing may well be degrading the informational content of the markets, messing up price signals and making business decisions harder as a result.
  • When one of these commodities ends up on an index, the firms that use that commodity in their business see a 6 percent increase in costs and a 40 percent decrease in operating profits, relative to firms without exposure to the commodity, the academics found
  • Their theory is that ETF trading shifts prices in subtle ways, making it harder for businesses to know when to buy their gold and copper. Corporate executives “are being influenced by what happens in the futures market, and what happens in the futures market is being influenced by ETF trading,”
  • More broadly, the Bernstein analysts, among others, worry that index-linked investing is increasing correlation, whereby the prices of stocks, bonds, and other assets move up or down or sideways together.
  • the price fluctuations of a newly indexed stock “magically and quickly” change. A firm’s shares begin to move “more closely with its 499 new neighbors and less closely with the rest of the market. It is as if it has joined a new school of fish.”
  • A far bigger concern is that the rise of the indexers might be making American firms less competitive, through “common ownership,” in which the mega-asset managers control large stakes in multiple competitors in the same industry. The passive firms control big chunks of the airlines American, Delta, JetBlue, Southwest, and United, for instance
  • The rise of common ownership might be perverting corporate behavior in weird ways, academics argue. Think about the incentives like this: Let’s imagine that you are a major shareholder in a public widget company. We’d expect you to desire—insist, even—that the company fight for market share and profits. But now imagine that you are a major shareholder in all the important widget companies. You would no longer really care which one succeeded, particularly not if one company doing better meant another company doing worse. You’d just care about the widget sector’s corporate profits, which would go up if the widget companies quit competing with one another and started raising prices to pad their bottom line.
  • one major paper showed that common ownership of airline stocks had the effect of raising ticket prices by 3 to 7 percent.
  • A separate study showed that consumers are paying higher prices for prescription medicines because generic-drug makers have less incentive to compete with the companies making name-brand drugs.
  • Yet another study showed that common ownership is leading retail banks to charge higher prices.
  • Across firms, executive compensation seems to be more closely linked to a company’s performance when its shareholders are not invested in the company’s rivals, the study found. In other words, firms stop paying managers for performance when owned by the same people who own their rivals.
  • The market clout of the indexers raises other questions too. The actual owners of the stocks—not the index-fund managers but the people putting money into index funds—have little say over the companies they own. Vanguard, Fidelity, and State Street, not Mom and Dad, vote in shareholder elections
  • In fact, the Big Three cast roughly 25 percent of the votes in S&P 500 companies.
  • In an interview with The Wall Street Journal, the chief executive officer of State Street said he thought it was “almost inevitable, when you see this kind of concentration, that it probably will make sense to do something about it.”
  • But figuring out what the appropriate restrictions are depends on determining just what the problem with the indexers is—are they distorting price signals, raising the cost of consumer goods, posing financial systemic risk, or do they just have the market cornered? Then, what to do about it? Common ownership is not a problem the government is used to handling.
  • Thanks to the passive revolution, a broad variety and huge number of firms might have less incentive to compete. The effect on the real economy might look a lot like that of rising corporate concentration. And the two phenomena might be catalyzing one another, as index investing increases the number of mergers and makes them more lucrative.
  • In recent decades, the whole economy has gone on autopilot. Index-fund investment is hyperconcentrated. So is online retail. So are pharmaceuticals. So is broadband. Name an industry, and it is likely dominated by a handful of giant players. That has led to all sorts of deleterious downstream effects: suppressing workers’ wages, raising consumer prices, stifling innovation, stoking inequality, and suffocating business creation
  • The problem is not just the indexers. It is the public markets they reflect, where more chaos, more speculation, more risk, more innovation, and more competition are desperately needed.
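
A rough sketch of the fee arithmetic in the $100 comparison flagged above. The article gives only the ending values ($356.66 versus $313.37); the gross market return here is my assumption (about 13.6 percent a year, compounded annually, with the fee subtracted from each year's return), chosen to roughly reproduce those figures.

```python
# Sketch of fee drag on a $100 investment over 10 years. The ~13.6% gross
# annual return is an assumption used to approximate the article's numbers.
def grow(principal, gross_return, annual_fee, years):
    value = principal
    for _ in range(years):
        value *= 1 + gross_return - annual_fee
    return value

GROSS = 0.136  # assumed average annual return of the S&P 500 over that decade
print(round(grow(100, GROSS, 0.0004, 10), 2))  # index fund (0.04% fee), ~ $357
print(round(grow(100, GROSS, 0.0150, 10), 2))  # active fund (1.5% fee),  ~ $313
```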

Hedge Funds Faced Choppy Waters in 2015, but Chiefs Cashed In - The New York Times

  • Those riches came during a year of tremendous market volatility that was so bad for some Wall Street investors that the billionaire manager Daniel S. Loeb called it a “hedge fund killing field.”
  • The 25 best-paid hedge fund managers took home a collective $12.94 billion in income last year,
  • top hedge fund managers earn more than 50 times what the top executives at banks are paid.
  • Their firms do more business in some corners of the financial world than many banks, including lending to low-income homeowners and small businesses. They lobby members of Congress. And they have put large sums of money behind presidential candidates, at times pumping tens of millions of dollars into super PACs.
  • The hedge fund industry has now ballooned in size, to $2.9 trillion, from $539 billion in 2001. So, too, has the pay of the industry’s leaders.
  • When Institutional Investor first started ranking hedge fund pay 15 years ago, George Soros topped the Alpha list, earning $700 million. In 2015, Mr. Griffin, who started trading as a Harvard sophomore out of his dorm room, and James H. Simons, a former math professor, each took home $1.7 billion
  • his own personal wealth has grown exponentially, and is estimated by Forbes at $7.5 billion.
  • Mr. Griffin’s firm, Citadel, has grown from a hedge fund that managed family and pension fund money into a $25 billion firm
  • He recently made headlines when he paid $500 million for two pieces of art. In September, Mr. Griffin, 47, reportedly paid $200 million to buy several floors in a new luxury condo tower that is being built at 220 Central Park South, in Manhattan.
  • He was the biggest donor to the successful re-election campaign of Mayor Rahm Emanuel of Chicago. More recently he has poured more than $3.1 million into the failed presidential campaigns of Marco Rubio, Jeb Bush and Scott Walker, as well as the Republican National Committee
  • Citadel’s flagship Kensington and Wellington hedge funds returned 14.3 percent over 2015
  • Mr. Simons, 78, has been a major political donor to the Democrats, donating $9.2 million in 2016, including $7 million to Priorities USA Action, a super PAC supporting Hillary Clinton.
  • Among 2015’s top hedge fund earners are five men who actually lost money for some investors last year but still made handsome profits because their firms are so big
  • For many managers, collecting large pay, even when performance was not tops, has become a side effect of growing bigger. “Once a hedge fund gets to be large enough to produce incredibly outsized remuneration, the hardest part of due diligence is determining whether the investment process is affected,
  • “Is the goal to continue to make money in a risky environment or is the goal to preserve assets on which you collect fees?”
  • Michael Platt, the founder of BlueCrest Capital Management, took home $260 million, according to Alpha. It was a difficult year for his firm, once one of the biggest hedge funds in Europe with $37 billion in investor money. His flagship fund lost investors 0.63 percent over the year, and he then told them he was throwing in the towel.

Climate Reparations Are Officially Happening - The Atlantic

  • Today, on the opening day of COP28, the United Nations climate summit in Dubai, the host country pushed through a decision that wasn’t expected to happen until the last possible minute of the two-week gathering: the creation and structure of the “loss and damage” fund, which will source money from developed countries to help pay for climate damages in developing ones. For the first time, the world has a system in place for climate reparations.
  • Nearly every country on Earth has now adopted the fund, though the text is not technically final until the end of the conference, officially slated for December 12.
  • “We have delivered history today—the first time a decision has been adopted on day one of any COP,”
  • Over much opposition from developing countries, the U.S. has insisted that the fund (technically named the Climate Impact and Response Fund) will be housed at the World Bank, where the U.S. holds a majority stake; every World Bank president has been a U.S. citizen. The U.S. also insisted that contributing to the fund not be obligatory. Sue Biniaz, the deputy special envoy for climate at the State Department, said earlier this year that she “violently opposes” arguments that developed countries have a legal obligation under the UN framework to pay into the fund.
  • The text agreed upon in Dubai on Thursday appears to strike a delicate balance: The fund will indeed be housed at the World Bank, at least for four years, but it will be run according to direction provided at the UN climate gatherings each year, and managed by a board where developed nations are designated fewer than half the seats.
  • That board’s decisions will supersede those of the World Bank “where appropriate.” Small island nations, which are threatened by extinction because of sea-level rise, will have dedicated seats. Countries that are not members of the World Bank will still be able to access the fund.
  • the U.S. remains adamant that the fund does not amount to compensation for past emissions, and it rejects any whiff of suggestions that it is liable for other countries’ climate damages.
  • Even the name “loss and damage,” with its implication of both harm and culpability, has been contentious among delegates
  • Several countries immediately announced their intended contribution to the fund. The United Arab Emirates and Germany each said they would give $100 million. The U.K. pledged more than $50 million, and Japan committed to $10 million. The U.S. said it would provide $17.5 million, a small number given its responsibility for the largest historical share of global emissions.
  • Total commitments came in on the order of hundreds of millions, far shy of an earlier goal of $100 billion a year.
  • Other donations may continue to trickle in. But the sum is paltry considering researchers recently concluded that 55 climate-vulnerable countries have incurred $525 billion in climate-related losses from 2000 to 2019, depriving them of 20 percent of the wealth they would otherwise have
  • Still, it’s a big change in how climate catastrophe is treated by developed nations. For the first time, the countries most responsible for climate change are collectively, formally claiming some of that responsibility
  • One crucial unresolved variable is whether countries such as China and Saudi Arabia—still not treated as “developed” nations under the original UN climate framework—will acknowledge their now-outsize role in worsening climate change by contributing to the fund.
  • Another big question now will be whether the U.S. can get Congress to agree to payments to the fund, something congressional Republicans are likely to oppose.
  • Influence by oil and gas industry interests—arguably the entities truly responsible for driving climate change—now delays even public funding of global climate initiatives, he said. “The fossil-fuel industry has successfully convinced the world that loss and damage is something the taxpayer should pay for.” And yet, Whitehouse told me that the industry lobbies against efforts to use public funding this way, swaying Congress and therefore hobbling the U.S.’s ability to uphold even its meager contributions to international climate funding.

Americans Aren't Saving Enough for Retirement, but One Change Could Help - NYTimes.com - 0 views

  • On average, a typical working family in the anteroom of retirement — headed by somebody 55 to 64 years old — has only about $104,000 in retirement savings
  • more than half of all American households will not have enough retirement income to maintain the living standards they were accustomed to before retirement,
  • 83 percent of baby boomers and Generation Xers in the bottom fourth of the income distribution will eventually run short of money.
  • More than a quarter of those with incomes between the middle of the income distribution and the 75th percentile will probably run short.
  • The standard prescription is that Americans should put more money aside in investments. The recommendation, however, glosses over a critical driver of unpreparedness: Wall Street is bleeding savers dry.
  • “A greater part of the problem is the failure of investors to earn their fair share of market returns.”
  • His observation suggests a different policy prescription: shoring up Americans’ retirement requires, first of all, aligning the interests of investment advisers and their clients.
  • Actively managed mutual funds, in which many workers invest their retirement savings, are enormously costly.
  • Altogether, costs add up to 2.27 percent per year, Mr. Bogle estimates.
  • By contrast, a passive index fund, like Vanguard’s Total Stock Market Index Fund, costs merely 0.06 percent a year in all.
  • Assuming an annual market return of 7 percent, he says, a 30-year-old worker who made $30,000 a year and received a 3 percent annual raise could retire at age 70 with $927,000 in the pot by saving 10 percent of her wages every year in a passive index fund. (Such a nest egg, at the standard withdrawal rate of 4 percent, would generate an inflation-adjusted $37,000 a year more or less indefinitely.) If she put it in a typical actively managed fund, she would end up with only $561,000. [A sketch of this arithmetic follows the list.]
  • In 1979, almost two in five private sector workers had a defined-benefit pension that would pay out a check until they died. Today only 14 percent do. Almost one in three, by contrast, must make do with a retirement savings account alone to supplement their Social Security check.
  • nobody was paying attention to the safeguards that might be needed when corporate retirement funds managed by sophisticated professionals were replaced by individual 401(k)s and Individual Retirement Accounts.
  • “Wall Street makes no money on low-cost index funds,” said David F. Swensen, who runs the investment portfolio for Yale. “That is the problem.”
  • [Researchers at] Harvard and colleagues from M.I.T. and the University of Hamburg sent “mystery shoppers” to visit financial advisers. They found that advisers mostly recommended investment strategies that fit their own financial interests. They reinforced their clients’ misguided biases, encouraging them to chase returns and advising against low-cost options like low-fee index funds.
  • For all their flaws, 401(k) plans have a fiduciary responsibility to act in participants’ best interest. Managers of I.R.A.s, by contrast, are not legally bound to put their clients’ interests first. They must offer “suitable” products — a much squishier standard.
  • The White House’s Council of Economic Advisers argues that “conflicted advice” by advisers who get payments from the funds they recommend reduces the annual returns to investment by 1 percentage point, a more modest penalty than Mr. Bogle’s analysis
  • In 2010, the Labor Department proposed imposing fiduciary responsibility on I.R.A. advisers. The resistance from Wall Street was so fierce that the Obama administration was forced to back down. Last month, the administration tried again.
  • Unlike regulations in Canada and some Western European countries, which have essentially banned kickbacks from funds to investment advisers, the Obama administration’s proposed rule does not directly attack conflicts of interest.
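
A back-of-the-envelope sketch of the Bogle-style retirement arithmetic flagged above. The contribution timing is my assumption (savings added at the start of each year and compounded for the full year, with the fund's fee simply subtracted from the 7 percent gross return); under those assumptions the numbers land close to the article's $927,000, $561,000 and $37,000 figures.

```python
# Sketch: 40 years of saving 10% of a $30,000 salary that grows 3% a year,
# invested at a 7% gross return minus the fund's annual fee. Contribution
# timing (start of year) is an assumption, not stated in the article.
def nest_egg(salary=30_000, raise_rate=0.03, savings_rate=0.10,
             gross_return=0.07, annual_fee=0.0, years=40):
    balance = 0.0
    for _ in range(years):
        balance += salary * savings_rate           # this year's contribution
        balance *= 1 + gross_return - annual_fee   # grow the balance, net of fees
        salary *= 1 + raise_rate                   # next year's raise
    return balance

passive = nest_egg(annual_fee=0.0006)   # low-cost index fund, ~ $927,000
active = nest_egg(annual_fee=0.0227)    # typical active fund, ~ $561,000
print(round(passive), round(active))
print(round(0.04 * passive))            # 4% withdrawal rule,  ~ $37,000 a year
```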

A Deadly Coronavirus Was Inevitable. Why Was No One Ready? - WSJ

  • When Disease X actually arrived, as Covid-19, governments, businesses, public-health officials and citizens soon found themselves in a state of chaos, battling an invisible enemy with few resources and little understanding—despite years of work that outlined almost exactly what the virus would look like and how to mitigate its impact.
  • Governments had ignored clear warnings and underfunded pandemic preparedness. They mostly reacted to outbreaks, instead of viewing new infectious diseases as major threats to national security. And they never developed a strong international system for managing epidemics, even though researchers said the nature of travel and trade would spread infection across borders.
  • Underlying it all was a failure that stretches back decades. Most everyone knew such an outcome was possible. And yet no one was prepared.
  • Last year, a Chinese scientist he worked with published a specific forecast: “It is highly likely that future SARS- or MERS-like coronavirus outbreaks will originate from bats, and there is an increased probability that this will occur in China.”
  • Humans today are exposed to more deadly new pathogens than ever. They typically come from animals, as global travel, trade and economic development, such as meat production and deforestation, push people, livestock and wildlife closer together
  • Scientists knew infectious disease outbreaks were becoming more common, with 2010 seeing more than six times as many outbreaks of pathogens of animal origin as 1980, according to data in a study by Brown University researchers.
  • Yet plenty was left undone, in areas including funding, early-warning systems, the role of the WHO and coordination with China. A big chunk of U.S. funding went toward protecting Americans against a bioterror attack. Government funding for pandemics has come largely in emergency, one-time packages to stop an ongoing outbreak.
  • She said a better solution would be to fund public health more like national defense, with much more guaranteed money, year in, year out.
  • “Will there be another human influenza pandemic?” Dr. Webster asked in a paper presented at an NIH meeting in 1995. “The certainty is that there will be.”
  • Experts including Dr. Webster were particularly concerned about the potential for spillover in southern China, where large, densely populated cities were expanding rapidly into forests and agricultural lands, bringing people into closer contact with animals. Two of the three influenza pandemics of the 20th century are thought to have originated in China.
  • Dr. Webster and others warned it could re-emerge or mutate into something more contagious. With U.S. funding, he set up an animal influenza surveillance center in Hong Kong. The WHO, which hadn’t planned for pandemics before, started compiling protocols for a large-scale outbreak, including contingency plans for vaccines.
  • At a dinner back in the U.S., he remembers one guest saying, “Oh, you really needed to have someone in the U.S. to be impacted to really galvanize the government.”
  • That “drove home the reality in my own mind of globalization,” said Dr. Fukuda. SARS showed that viruses can crisscross the globe by plane in hours, making a local epidemic much more dangerous.
  • The WHO’s director-general, Gro Harlem Brundtland, publicly criticized China. The government under new leaders reversed course. It implemented draconian quarantines and sanitized cities, including a reported 80 million people enlisted to clean streets in Guangdong.
  • By May 2003, the number of new SARS cases was dwindling. It infected around 8,000 people world-wide, killing nearly 10%.
  • After SARS, China expanded epidemiologist training and increased budgets for new laboratories. It started working more closely in public health with the U.S., the world’s leader. The U.S. CDC opened an office in Beijing to share expertise and make sure coverups never happened again. U.S. CDC officials visiting a new China CDC campus planted a friendship tree.
  • In Washington in 2005, a powerful player started driving U.S. efforts to become more prepared. President George W. Bush had read author John M. Barry’s “The Great Influenza,” a history of the 1918 flu pandemic
  • Mr. Bush leaned toward the group of 10 or so officials and said, “I want to see a plan,” according to Dr. Venkayya. “He had been asking questions and not getting answers,” recalled Dr. Venkayya, now president of Takeda Pharmaceutical Co. ’s global vaccine business unit. “He wanted people to see this as a national threat.”
  • Mr. Bush launched the strategy in November, and Congress approved $6.1 billion in one-time funding.
  • The CDC began exercises enacting pandemic scenarios and expanded research. The government created the Biomedical Advanced Research and Development Authority to fund companies to develop diagnostics, drugs and vaccines.
  • A team of researchers also dug into archives of the 1918 pandemic to develop guidelines for mitigating the spread when vaccines aren’t available. The tactics included social distancing, canceling large public gatherings and closing schools—steps adopted this year when Covid-19 struck, though at the time they didn’t include wide-scale lockdowns.
  • A year after the plan was released, a progress report called for more real-time disease surveillance and preparations for a medical surge to care for large numbers of patients, and stressed strong, coordinated federal planning.
  • A European vaccine makers’ association said its members had spent around $4 billion on pandemic vaccine research and manufacturing adjustments by 2008.
  • The $6.1 billion Congress appropriated for Mr. Bush’s pandemic plan was spent mostly to make and stockpile medicines and flu vaccines and to train public-health department staff. The money wasn’t renewed. “The reality is that for any leader it’s really hard to maintain a focus on low-probability high-consequence events, particularly in the health arena,” Dr. Venkayya said.
  • In the U.S., President Barack Obama’s administration put Mr. Bush’s new plan into action for the first time. By mid-June, swine flu, as it was dubbed, had jumped to 74 countries. The WHO officially labeled it a pandemic, despite some evidence suggesting the sickness was pretty mild in most people.
  • That put in motion a host of measures, including some “sleeping” contracts with pharmaceutical companies to begin vaccine manufacturing—contracts that countries like the United Kingdom had negotiated ahead of time so they wouldn’t have to scramble during an outbreak.
  • In August, a panel of scientific advisers to Mr. Obama published a scenario in which as many as 120 million Americans, 40% of the population, could be infected that year, and up to 90,000 people could die.
  • H1N1 turned out to be much milder. Although it eventually infected more than 60 million Americans, it killed fewer than 13,000. In Europe, fewer than 5,000 deaths were reported.
  • The WHO came under fire for labeling the outbreak a pandemic too soon. European lawmakers, health professionals and others suggested the organization may have been pressured by the pharmaceutical industry.
  • France ordered 94 million doses, but had logged only 1,334 serious cases and 312 deaths as of April 2010. It managed to cancel 50 million doses and sell some to other countries, but it was still stuck with a €365 million tab, or about $520 million at the time, and 25 million extra doses.
  • The WHO had raised scares for SARS, mad-cow disease, bird flu and now swine flu, and it had been wrong each time, said Paul Flynn, a member of the Council of Europe’s Parliamentary Assembly and a British lawmaker, at a 2010 health committee hearing in Strasbourg.
  • Ultimately, an investigation by the council’s committee accused the WHO and public-health officials of jumping the gun, wasting money, provoking “unjustified fear” among Europeans and creating risks through vaccines and medications that might not have been sufficiently tested.
  • “I thought you might have uttered a word of regret or an apology,” Mr. Flynn told Dr. Fukuda, who as a representative of the WHO had been called to testify.
  • Back in Washington, scientist Dennis Carroll, at the U.S. Agency for International Development, was also convinced that flu wasn’t the only major pandemic threat. In early 2008, Dr. Carroll was intrigued by Dr. Daszak’s newly published research that said viruses from wildlife were a growing threat, and would emerge most frequently where development was bringing people closer to animals.
  • If most of these viruses spilled over to humans in just a few places, including southern China, USAID could more easily fund an early warning system.
  • “You didn’t have to look everywhere,” he said he realized. “You could target certain places.” He launched a new USAID effort focused on emerging pandemic threats. One program called Predict had funding of about $20 million a year to identify pathogens in wildlife that have the potential to infect people.
  • Drs. Daszak, Shi and Wang, supported by funds from Predict, the NIH and China, shifted their focus to Yunnan, a relatively wild and mountainous province that borders Myanmar, Laos and Vietnam.
  • One key discovery: a coronavirus resembling SARS that lab tests showed could infect human cells. It was the first proof that SARS-like coronaviruses circulating in southern China could hop from bats to people. The scientists warned of their findings in a study published in the journal Nature in 2013.
  • Evidence grew that showed people in the area were being exposed to coronaviruses. One survey turned up hundreds of villagers who said they recently showed symptoms such as trouble breathing and a fever, suggesting a possible viral infection.
  • Over the next several years, governments in the U.S. and elsewhere found themselves constantly on the defensive from global viral outbreaks. Time and again, preparedness plans proved insufficient. One, which started sickening people in Saudi Arabia and nearby
  • On a weekend morning in January 2013, more than a dozen senior Obama administration officials met in a basement family room in the suburban home of a senior National Security Council official. They were brainstorming how to help other countries upgrade their epidemic response capabilities, fueled by bagels and coffee. Emerging disease threats were growing, yet more than 80% of the world’s countries hadn’t met a 2012 International Health Regulations deadline to be able to detect and respond to epidemics.
  • The session led to the Global Health Security Agenda, launched by the U.S., the WHO and about 30 partners in early 2014, to help nations improve their capabilities within five years.
  • Money was tight. The U.S. was recovering from the 2008-09 financial crisis, and federal funding to help U.S. states and cities prepare and train for health emergencies was declining. Public-health departments had cut thousands of jobs, and outdated data systems weren’t replaced.
  • “It was a Hail Mary pass,” said Tom Frieden, who was director of the CDC from 2009 to 2017 and a force behind the creation of the GHSA. “We didn’t have any money.”
  • At the WHO, Dr. Fukuda was in charge of health security. When the Ebola outbreak was found in March 2014, he and his colleagues were already stretched, after budget cuts and amid other crises.
  • The United Nations created a special Ebola response mission that assumed the role normally played by the WHO. Mr. Obama sent the U.S. military to Liberia, underscoring the inability of international organizations to fully handle the problem.
  • It took the WHO until August to raise an international alarm about Ebola. By then, the epidemic was raging. It would become the largest Ebola epidemic in history, with at least 28,600 people infected, and more than 11,300 dead in 10 countries. The largest outbreak before that, in Uganda, had involved 425 cases.
  • Congress passed a $5.4 billion package in supplemental funds over five years, with about $1 billion going to the GHSA. The flood of money, along with aggressive contact tracing and other steps, helped bring the epidemic to a halt, though it took until mid-2016.
  • Global health experts and authorities called for changes at the WHO to strengthen epidemic response, and it created an emergencies program. The National Security Council warned that globalization and population growth “will lead to more pandemics,” and called for the U.S. to do more.
  • Dr. Carroll of USAID, who had visited West Africa during the crisis, and saw some health workers wrap themselves in garbage bags for protection, started conceiving of a Global Virome Project, to detect and sequence all the unknown viral species in mammals and avian populations on the planet.
  • Billionaire Bill Gates warned in a TED talk that an infectious disease pandemic posed a greater threat to the world than nuclear war, and urged world leaders to invest more in preparing for one. The Bill & Melinda Gates Foundation helped form a new initiative to finance vaccines for emerging infections, the Coalition for Epidemic Preparedness Innovations.
  • Congress established a permanent Infectious Diseases Rapid Response Fund for the CDC in fiscal 2019, with $50 million for that year and $85 million in fiscal 2020.
  • In May 2018, John Bolton, then President Trump’s national security adviser, dismantled an NSC unit that had focused on global health security and biodefense, with staff going to other units. The senior director of the unit left.
  • It pushed emerging disease threats down one level in the NSC hierarchy, making pandemics compete for attention with issues such as North Korea, said Beth Cameron, a previous senior director of the unit. She is now vice president for global biological policy and programs at the Nuclear Threat Initiative.
  • Deteriorating relations with China reduced Washington’s activities there just as researchers were becoming more certain of the threat from coronaviruses.
  • Dr. Carroll had earlier been ordered to suspend his emerging pandemic threats program in China.
  • Dr. Carroll pitched to USAID his Global Virome Project. USAID wasn’t interested, he said. He left USAID last year. A meeting that Dr. Carroll planned for last August with the Chinese CDC and Chinese Academy of Sciences to form a Chinese National Virome Project was postponed due to a bureaucratic hang-up. Plans to meet are now on hold, due to Covid-19.

Opinion | Colleges Should Be More Than Just Vocational Schools - The New York Times

  • Between 2013 and 2016, across the United States, 651 foreign language programs were closed, while majors in classics, the arts and religion have frequently been eliminated or, at larger schools, shrunk. The trend extends from small private schools like Marymount to the Ivy League and major public universities, and shows no sign of stopping.
  • The steady disinvestment in the liberal arts risks turning America’s universities into vocational schools narrowly focused on professional training. Increasingly, they have robust programs in subjects like business, nursing and computer science but less and less funding for and focus on departments of history, literature, philosophy, mathematics and theology.
  • America’s higher education system was founded on the liberal arts and the widespread understanding that mass access to art, culture, language and science were essential if America was to thrive. But a bipartisan coalition of politicians and university administrators is now hard at work attacking it — and its essential role in public life — by slashing funding, cutting back on tenure protections, ending faculty governance and imposing narrow ideological limits on what can and can’t be taught.
  • For decades — and particularly since the 2008 recession — politicians in both parties have mounted a strident campaign against government funding for the liberal arts. They express a growing disdain for any courses not explicitly tailored to the job market and outright contempt for the role the liberal arts-focused university has played in American society.
  • Former Gov. Scott Walker’s assault on higher education in Wisconsin formed the bedrock of many later conservative attacks. His work severely undermined a state university system that was once globally admired. Mr. Walker reportedly attempted to cut phrases like “the search for truth” and “public service” — as well as a call to improve “the human condition” — from the University of Wisconsin’s official mission statement
  • But blue states also regularly cut higher education funding, sometimes with similar rationales. In 2016, Matt Bevin, the Republican governor of Kentucky at the time, suggested that students majoring in the humanities shouldn’t receive state funding. The current secretary of education, Miguel Cardona, a Democrat, seems to barely disagree. “Every student should have access to an education that aligns with industry demands and evolves to meet the demands of tomorrow’s global work force,” he wrote in December.
  • Federal funding reflects those priorities. The National Endowment for the Humanities’ budget in 2022 was just $180 million. The National Science Foundation’s budget was about 50 times greater, having nearly doubled within two decades.
  • What were students meant to think? As the cost of higher education rose, substantially outpacing inflation since 1990, students followed funding — and what politicians repeatedly said about employability — into fields like business and computer science. Even majors in mathematics were hit by the focus on employability.
  • Universities took note and began culling. One recent study showed that history faculty across 28 Midwestern universities had dropped by almost 30 percent in roughly the past decade. Classics programs, including the only one at a historically Black college, were often simply eliminated.
  • Higher education, with broad study in the liberal arts, is meant to create not merely good workers but good citizens
  • this is a grim and narrow view of the purpose of higher education, merely as a tool to train workers as replaceable cogs in America’s economic machine, to generate raw material for its largest companies.
  • Citizens with knowledge of their history and culture are better equipped to lead and participate in a democratic society; learning in many different forms of knowledge teaches the humility necessary to accept other points of view in a pluralistic and increasingly globalized society.
  • In 1947, a presidential commission bemoaned an education system where a student “may have gained technical or professional training” while being “only incidentally, if at all, made ready for performing his duties as a man, a parent and a citizen.” The report recommended funding to give as many Americans as possible the sort of education that would “give to the student the values, attitudes, knowledge and skills that will equip him to live rightly and well in a free society,” which is to say the liberal arts as traditionally understood. The funding followed.
  • The report is true today, too
  • the American higher education system is returning to what it once was: liberal arts finishing schools for the wealthy and privileged, and vocational training for the rest.
  • Reversing this decline requires a concerted effort by both government and educational actors
  • renewed funding for the liberal arts — and especially the humanities — would support beleaguered departments and show students that this study is valuable and valued.
  • At the university level, instituting general education requirements would guarantee that even students whose majors have nothing to do with the humanities emerged from college equipped to think deeply and critically across disciplines.
  • Liberal arts professors must also be willing to leave their crumbling ivory towers and the parochial debates about their own career path, in order to engage directly in public life

These Truths: A History of the United States (Jill Lepore)

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force. This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy. The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.” They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew. The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
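As a rough check on the arithmetic in the excerpt above — a minimal sketch, not part of the quoted text, assuming only the figures given there (a doubling every twenty-five years, Franklin's guess of about one million, and the actual figure of more than 1.5 million) — the compounding works out as follows:

```python
# Illustrative check of Franklin's doubling arithmetic (figures from the excerpt only).
# Franklin guessed ~1.0 million colonists in 1751 and a doubling every 25 years;
# the actual mainland population was over 1.5 million.
def project(population_millions: float, years: int, doubling_period: int = 25) -> float:
    """Project a population forward assuming a fixed doubling period."""
    return population_millions * 2 ** (years / doubling_period)

print(project(1.0, 100))  # Franklin's own numbers: ~16 million after a century
print(project(1.5, 100))  # the higher, actual starting figure: ~24 million
```

Either way, four doublings in a century mean a sixteenfold increase, which is the scale behind Franklin's prediction that "the greatest Number of Englishmen will be on this Side the Water"; starting from the higher, actual figure only strengthens the point.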
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it ensures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprung up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • Portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, "Art will be democratized."
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed "to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other."
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation's economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new "Rules Governing the Application of Passports," which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
    [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government, and, as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • Economic instability contributed to a broader set of political concerns that became Mary Lease's obsession, concerns known as "the money question," and traceable all the way back to Hamilton's economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word "person" in the Fourteenth Amendment, Conkling enjoyed a singular authority: he'd served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word "citizen" in favor of "person" in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • Henry George devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending "the Republicanism of Jefferson and the Democracy of Jackson."72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • State legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago's divinity school defined modernism as "the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons."112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era.
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • Its leaders argued that fighting inequality produced by industrialism was an obligation of Christians: "We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world."9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they'd been confined to just eleven. In Baltimore, blacks couldn't buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Woodrow Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • The U.S. Supreme Court struck down much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner's liberty of contract, the freedom to forge agreements with his workers, something the court's majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • Conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • "Scientific management has no place for a bird that can sing and won't sing," answered Taylor. "We are not . . . dealing with horses nor singing birds," Wilson told Taylor. "We are dealing with men who are a part of society and for whose benefit society is organized."
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

Scientists Predicted the Coronavirus Pandemic - The Atlantic - 0 views

  • The now-prophetic words could be found at the end of a research paper published in the journal Clinical Microbiology Reviews in October of 2007: “The presence of a large reservoir of SARS-CoV-like viruses in horseshoe bats, together with the culture of eating exotic mammals in southern China, is a time bomb.”
  • The warning—made nearly 13 years ago and more than four years after the worrying first wave of severe acute respiratory syndrome, or SARS, killed nearly 800 people globally—was among the earliest to predict the emergence of something like SARS-CoV-2, the virus behind the current COVID-19 pandemic.
  • ...25 more annotations...
  • Dogged by skepticism and inconsistent funding, these coronavirus researchers were stymied from developing treatments and vaccines for SARS—many of which could have been helpful in the current crisis.
  • Another similarly affected researcher was Brenda Hogue, a virologist at Arizona State University in Tempe. Hogue had devoted her career to studying coronaviruses, focusing on the protein machinery that drives their assembly. After SARS, she and her colleagues turned part of their attention toward developing a vaccine. But when the funding dropped off in 2008, she said, the vaccine went into limbo “and we put our efforts into other directions.”
  • to some experts whose business it is to hunt potential pathogens before they spill over into human populations, the many years spent not girding for a serious coronavirus outbreak were tragically—and unnecessarily—wasted.
  • “We were out there on the ground after SARS, working on coronaviruses with Chinese colleagues in collaboration,” said Peter Daszak, president of the EcoHealth Alliance, a New York–based nonprofit group that took part in a large federally funded effort, called Predict, to hunt for new pandemic viruses in wildlife in 31 countries, including China. That program was famously defunded last fall, just before the SARS-CoV-2 outbreak began.
  • “But we were the only group of western scientists,” Daszak added. “How can we be the only people looking for these viruses when there was such a clear and present danger?”
  • when SARS emerged in late 2002, there was initially “general disbelief among medical people that a coronavirus could be the basis of such a huge outbreak.”
  • As that epidemic spread, an influx of new researchers crowded the field. More grants were awarded, and funding started to climb. “Everyone wanted to know where the virus had come from,” said Ralph Baric, a microbiologist at the University of North Carolina’s Gillings School of Global Public Health. Initial findings pointed to wild civets and raccoon dogs sold for meat and pelts, respectively, in Chinese markets. Later evidence began to implicate horseshoe bats as the original source of the infections. Some researchers whose pre-SARS careers had been grounded in basic coronavirus biology began working on therapies and vaccines—and they made steady progress for several years.
  • funding declines hobbled individual investigators who weren’t part of these larger consortia. Pharmaceutical companies that develop vaccines and therapies scaled back on coronavirus research, too. Within a few years after the SARS outbreak, public health funding agencies both in the United States and abroad “no longer regarded coronaviruses as a high public health threat compared to other diseases,” Saif wrote in an email.
  • Then on May 12, The Wall Street Journal reported that the Chinese government was responding in kind, “by stalling international efforts to find the source of the [SARS-CoV-2] virus amid an escalating U.S. push to blame China for the pandemic.”
  • To demonstrate that a particular virus is actually harmful to people, scientists need to isolate and culture the microbe and show it infects human cells in the lab
  • Led by virologist Zheng-Li Shi, the Wuhan team reported in 2013 that this particular virus, called WIV1, binds with ACE2 in civet and human cells, and then replicates efficiently inside them. “That was the red flag,” Saif said. Earlier evidence suggested that direct contact with these bats could lead to viral spillover in humans. “Now there was proof of that.”
  • When cases of those diseases fell off, public-health responders shifted to other viral emergencies such as Ebola and Zika, and coronavirus research funding dropped sharply.
  • They created a hybrid microbe by attaching the spike protein from SHC014 to the genetic backbone of a SARS-like virus that was previously adapted to grow in mice. Called a chimera—an organism containing cells with more than one genotype—this lab-made microbe had no problem binding with ACE2 and infecting human cells. Baric’s research team concluded that like WIV1, any SARS-like viruses outfitted with the SHC014 spike could pose cross-species threats to people.
  • Baric acknowledged the risky nature of the research but emphasized the safety protocols. “In general, we don’t know the transmissibility or virulence potential of any bat viruses or chimeras,” Baric said in an email message. “Hence it’s best to keep and work with them under biosafety level 3 laboratory conditions to maximize safety.”
  • Baric also pointed out that a chimera would display a genetic signature “that says what it is.” The adjoining parts of a chimera segregate discretely in a logical pattern.
  • A genetic analysis of the chimera produced in his lab, for instance, “would come out to be mouse-adapted SARS everywhere but the spike, which is SHC014.” Similar logical patterns are absent in SARS-CoV-2, indicating that the virus that causes COVID-19 evolved naturally.
  • Even as Baric and others were generating lab evidence that more SARS-like viruses were poised for human emergence, another outbreak—in pigs, not people—provided another strong and recent signal: Some 25,000 piglets were killed by a coronavirus in the Guangdong province of China, starting in 2016. That virus, too, was found in horseshoe bats, and Buchmeier described the outbreak as both a major cross-species spillover and a warning shot that was never really picked up by the broader public-health community.
  • The EcoHealth Alliance, which had been part of the Predict effort, maintained its own collaboration with the Wuhan Institute of Virology using funds supplied by the National Institutes of Health. But on April 24, the Trump administration—which is investigating whether SARS-CoV-2 escaped accidentally from the Wuhan Institute, an allegation that’s been broadly discredited—directed the NIH to cut off that support.
  • The bats had been trapped in a cave in Kunming, the capital of the Yunnan province. At least seven other SARS-like strains were present in that same colony, leading the researchers to speculate that bat coronaviruses remained “a substantial global threat to public health.”
  • To disease experts, the bickering is a worrying—perhaps even astonishing—indicator that at least some global leaders still aren’t hearing what they have to say about the threat of coronaviruses, and Baric asserted that the ongoing pandemic exposes the need for better communication between countries, not less. “That is absolutely key,” he said. “Critical information needs to be passed as quickly as possible.”
  • Many other warnings would follow. Indeed, evidence of a looming and more deadly coronavirus pandemic had been building for years. Yet experts who specialize in coronaviruses—a large family of pathogens, found especially in birds and mammals, that can cross over to humans from other mammals and cause varying degrees of illness—struggled to convince a broader audience of the risk.
  • the number of coronavirus-research grants funded by the National Institutes of Health—which had increased from a low of 28 in 2002 to a peak of 103 in 2008—went into a tailspin.
  • Though support for coronavirus research spiked a bit with the MERS outbreak in 2012, the increase was short-lived. Since that outbreak was quickly contained, the disease didn’t raise wider concerns and grant opportunities declined further.
  • Ironically, just as funding for drugs and vaccines was drying up, evidence that other coronavirus threats lurked in wildlife was only getting stronger
  • Ten years would pass, however, before researchers could show there were other SARS-like viruses in nature that also bind with ACE2. The evidence came from a team based at the Wuhan Institute of Virology
katherineharron

Stimulus negotiations: A deal is within reach. Can Hill leaders finally strike one? - C... - 0 views

  • With government funding running out Friday night, lawmakers have to release a massive, $1.4 trillion package as soon as Tuesday if it has any chance of passing Congress and keeping agencies from shutting down by the weekend.
  • struggling Americans could once again be disappointed if there's no agreement and they're forced to wait even longer as lawmakers continue to haggle.
  • House Speaker Nancy Pelosi has invited Senate Minority Leader Chuck Schumer, Senate Majority Leader Mitch McConnell, and House Minority Leader Kevin McCarthy to her office for a meeting on Covid and government funding. The meeting is scheduled to occur at 4 p.m. ET.
  • ...14 more annotations...
  • Congress may have to pass yet another short-term stopgap resolution to give them more time to find an agreement.
  • If a sweeping government funding bill is released without pandemic relief, that would spell serious trouble for the effort to pass Covid aid before Congress breaks for the holidays and could signal the impending demise of the last-ditch effort to secure a stimulus deal.
  • As of late Monday night, there still was no final consensus, with familiar sticking points: Democrats want state and local money to help ensure workers who provide vital services are not laid off. Republicans believe much of that money will be wasted. And the GOP lawmakers who are open to more state and local aid say there also need to be lawsuit protections for businesses and other entities, but Democrats argue that the GOP proposals on that idea go too far.
  • House and Senate appropriators are planning to unveil a $1.4 trillion spending bill Tuesday to fund federal agencies until the end of September 2021, which leaves little time before the Friday deadline for what's expected to be a massive package to pass both chambers.
  • It's clear to virtually everyone in Washington that a deal is within reach that includes several key provisions: An extension of jobless benefits, money for vaccine distribution, funds for schools, small business loans -- among a handful of other issues.
  • Self-imposed deadlines have a way of slipping in Congress and it's always possible lawmakers won't release a massive funding deal Tuesday despite their intention to do so. If that happens, it could mean that talks over both stimulus and government spending are breaking down and lawmakers may be forced to punt the issue further down the road by walking away from a pandemic stimulus deal during the lame duck session of Congress and passing a short-term funding patch rather than a far broader, comprehensive spending deal.
  • "Either 100 senators will be here shaking our heads, slinging blame and offering excuses about why we still have not been able to make a law -- or we will break for the holidays having sent another huge dose of relief out the door for the people who need it."
  • There were clear signs on Monday that Democrats could be forced to abandon a push for at least $160 billion in aid to cash-strapped states and cities in order to get a bipartisan agreement on some relief provisions.
  • during a 22-minute phone call Monday evening, the speaker told Mnuchin that the GOP insistence to include lawsuit protections for businesses and other entities "remain an obstacle" to getting an agreement on state and local aid -- since Republicans have demanded the two be tied together.
  • A bipartisan group of lawmakers unveiled the legislative text of a $908 billion compromise Covid relief plan on Monday
  • If the aid is ultimately dropped from the plan, it would amount to a major concession from Democrats, who had advanced roughly $1 trillion for aid to states and cities as part of a $3 trillion-plus plan that passed the House in May and that the Senate never considered. Democrats had argued the money was paramount to ensure that workers performing vital services -- ranging from first responders to health care workers -- could continue to stay on the job.
  • If Democrats do drop their demand for state and local aid, the consensus bill put forward by the bipartisan coalition on Monday that sidesteps that issue as well as liability protections could serve as a ready-made starting point for what could be agreed to more widely on Covid relief. That bill has a price tag of $748 billion and includes policy ideas that have proven popular across party lines such as a boost to the Paycheck Protection Program.
  • "I am convinced the majority leader will actually bring legislation to the floor that will either take up our $748 billion bill or the total of $908 billion, or perhaps he will pick and choose from what we put together in a bill of his own and attach it to the omnibus spending bill."
  • According to a summary released on Monday, the bill would provide $300 billion for the Small Business Administration and funds that would give small businesses the chance to benefit from another loan through the PPP with certain eligibility restrictions. There would be $2.58 billion for CDC vaccine distribution and infrastructure and an extension of pandemic unemployment insurance programs for 16 weeks along with a $300 per week expansion of federal supplemental unemployment insurance benefits.
Javier E

Jack Bogle: The Undisputed Champion of the Long Run - WSJ - 0 views

  • Jack Bogle is ready to declare victory. Four decades ago, a mutual-fund industry graybeard warned him that he would “destroy the industry.” Mr. Bogle’s plan was to create a new mutual-fund company owned not by the founding entrepreneur and his partners but by the shareholders of the funds themselves. This would keep overhead low for investors, as would a second part of his plan: an index fund that would mimic the performance of the overall stock market rather than pay genius managers to guess which stocks might go up or down.
  • Not even Warren Buffett has minted more millionaires than Jack Bogle has—and he did so not by helping them get lucky, but by teaching them how to earn the market’s long-run, average return without paying big fees to Wall Street.
  • The mutual-fund industry is slowly liquidating itself—except for Vanguard. Mr. Bogle happily supplies the numbers: During the 12 months that ended May 31, “the fund industry took in $87 billion . . . of which $224 billion came into Vanguard.” In other words, “in the aggregate, our competitors experienced capital outflows of $137 billion.”
  • ...11 more annotations...
  • Mr. Bogle has some hard news for investors. The basic appeal of index funds—their ability to deliver the market return without shifting an arm and a leg to Wall Street’s army of helpers—will only become more important given the decade of depressed returns he sees ahead.
  • Don’t imagine a revisitation of the ’80s or ’90s, when stocks returned 18% a year and investors, after the industry’s rake-off, imagined they “had the greatest manager in the world” because they got 14%. Those planning on a comfy retirement or putting a kid through college will have to save more, work to keep costs low, and—above all—stick to the plan.
  • “When the climate really gets bad, I’m not some statue out there. But when I get knots in my stomach, I say to myself, ‘Reread your books,’ ” he says. Mr. Bogle has written numerous advice books on investing, including 2007’s “The Little Book of Common Sense Investing,” which remains a perennial Amazon best seller—and all of them emphasize not trying to outguess the markets.
  • That said, Mr. Bogle finds today’s stock scene puzzling. Shares are highly priced in historical terms; earnings and economic growth he expects to disappoint for at least the next decade (he sees no point in trying to forecast further). And yet he advises investors to stay invested and weather the storm: “If we’re going to have lower returns, well, the worst thing you can do is reach for more yield. You just have to save more.”
  • Mr. Bogle relies on a forecasting model he published 25 years ago, which tells him that investors over the next decade, thanks largely to a reversion to the mean in valuations, will be lucky to clear 2% annually after costs. Yuck.
  • Then why invest at all? Maybe it would be better to sell and stick the cash in a bank or a mattress. “I know of no better way to guarantee you’ll have nothing at the end of the trail,” he responds. “So we know we have to invest. And there’s no better way to invest than a diversified list of stocks and bonds at very low cost.”
  • Mr. Bogle’s own portfolio consists of 50% stocks and 50% bonds, the latter tilted toward short- and medium-term. Keep an eagle eye on costs, he says, in a world where pre-cost returns may be as low as 3% or 4%. Inattentive investors can expect to lose as much as 70% of their profits to “hidden” fund management costs in addition to the “expense ratios” touted in mutual-fund prospectuses. (These hidden costs include things like sales loads, transaction costs, idle cash and inefficient taxes; a back-of-the-envelope illustration of the cost drag follows this list.)
  • He also knows the heartache of having just about everything he has saved tied up in volatile, sometimes irrational markets, especially now. “We’re in a difficult place,” he says. “We live in an extremely risky world—probably more risky than I can recall.”
  • Investing, he says, always is “an act of trust—in the ability of civilization and the U.S. to continue to flourish; in the ability of corporations to continue, through efficiency and entrepreneurship and innovation, to provide substantial returns.” But nothing, not even American greatness, is guaranteed, he adds
  • what he calls the financial buccaneer type, an entrepreneur more interested in milking what’s left of the active-management-fee gravy train than in providing low-cost competition for Vanguard—which means Vanguard’s best days as guardian of America’s nest egg may still lie ahead.
  • the growth of indexing is obviously unwelcome writing on the wall for Wall Street professionals and Vanguard’s profit-making competitors like Fidelity, which have never been able to give heart and soul to low-churn indexing because indexing doesn’t generate large fees for executives and shareholders of management companies.
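The 70% figure in the note above sounds extreme, but simple compounding bears it out. The sketch below is a back-of-the-envelope illustration rather than Bogle's own model: it assumes a 3.5% gross annual return and roughly 2 percentage points of combined visible and hidden costs (both hypothetical round numbers, chosen to sit in the "3% or 4%" pre-cost world he describes) and compares cumulative profit with and without that drag over 30 years.

```python
# Back-of-the-envelope cost-drag illustration. All inputs are assumptions
# (a 3.5% gross return and ~2 points of total costs), not figures from Bogle.

def profit_per_dollar(rate: float, years: int) -> float:
    """Cumulative profit on $1 compounded at `rate` for `years` years."""
    return (1 + rate) ** years - 1.0

YEARS = 30
GROSS_RATE = 0.035   # hypothetical pre-cost return
TOTAL_COSTS = 0.02   # hypothetical expense ratio plus "hidden" costs

gross_profit = profit_per_dollar(GROSS_RATE, YEARS)
net_profit = profit_per_dollar(GROSS_RATE - TOTAL_COSTS, YEARS)
share_lost = 1 - net_profit / gross_profit

print(f"Gross profit per $1 invested: {gross_profit:.2f}")   # ~1.81
print(f"Net profit per $1 invested:   {net_profit:.2f}")     # ~0.56
print(f"Share of profit lost to costs: {share_lost:.0%}")    # ~69%
```

Run over 30 years, roughly two points of cost on a 3.5% gross return consumes about two thirds of the cumulative profit, which is the order of magnitude behind the "as much as 70%" warning.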
Javier E

Can We Be Brutally Honest About Investment Returns? - MoneyBeat - WSJ - 0 views

  • Pension funds have fantastical expectations of the market
  • With U.S. stocks at all-time highs, it’s more important than ever that investors be brutally realistic about future returns.
  • You can learn a lot from these folks — if you listen to them and then do the opposite.
  • ...11 more annotations...
  • A new study by finance professors Aleksandar Andonov of Erasmus University Rotterdam and Joshua Rauh of Stanford University looks at expected returns among more than 230 public pension plans with more than $2.8 trillion in combined assets.
  • For their portfolios, generally consisting of cash, U.S. and international bonds and stocks, real estate, hedge funds and private-equity or buyout funds, these pension plans report that they will earn an average of 7.6% annually over the long term. (That’s 4.8% after their estimates of inflation.) These funds often define “long term” as between 10 and 30 years.
  • Based on how they divvy up their money, how much are these pension funds assuming specific assets will earn?
  • They expect cash to return an average of 3.2% annually over the long run; bonds, 4.9%; such “real assets” as commodities and real estate, 7.7%; hedge funds, 6.9%; publicly traded stocks, 8.7%; private-equity funds, 10.3%. (A rough blend of these assumptions under hypothetical portfolio weights is sketched after this list.)
  • consider bonds. The simplest reliable indicator of how much you will earn from a portfolio of bonds in the future is their yield to maturity in the present. With 10-year Treasurys yielding 2.6% and investment-grade corporate bonds averaging under 3.7%, it would take a near-miracle today to get anything close to 4% out of a high-quality fixed-income portfolio.
  • That’s below the U.S. average of 10.2% annually over the past 90 years. But stocks were far cheaper over most of that period than they are today, so their returns were naturally higher.
  • stocks aren’t likely to earn more than an average of 5.9% annually over the long run from today’s lofty prices.
  • Among those, the least implausible scenario is higher inflation. So the pension funds could hit their 8.7% stock return that way — but such a surge in the cost of living would crimp their bond returns. What they would gain on their stocks they would lose on their bonds.
  • the new study of estimated returns finds that the older a pension fund’s holdings of private equity are, the more likely its officials are to extrapolate those returns — as if the good times of the early 2000s, when deals abounded and buyouts were cheaper, were still rolling.
  • Why do expectations among pension plans run so high? Because they have to, the chief investment officer of a large public pension plan tells me. State laws guarantee generous retirement benefits for millions of current and former government employees. To appear as if they can meet those obligations, the pension plans have no choice but to set their expected returns higher than reality is likely to deliver.
  • That’s the exact opposite of what the rest of us should do. Sooner or later, investors who build their expectations on hope rather than on arithmetic end up sorry.
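As a rough illustration of the arithmetic behind these annotations, the sketch below blends the asset-class assumptions quoted above into a single portfolio number. The allocation weights are hypothetical (the Andonov-Rauh study covers more than 230 plans with different mixes), so this is only a sanity check: it shows how heavily a plan must lean on stocks, real assets and private equity to reach 7.6%, and how far the blend falls if public stocks deliver closer to the 5.9% the column considers realistic.

```python
# Blend the pension plans' own long-run return assumptions (quoted above)
# across a hypothetical allocation. The weights are illustrative only.
ASSUMED_RETURNS = {
    "cash": 0.032,
    "bonds": 0.049,
    "real_assets": 0.077,
    "hedge_funds": 0.069,
    "public_stocks": 0.087,
    "private_equity": 0.103,
}

HYPOTHETICAL_WEIGHTS = {
    "cash": 0.03,
    "bonds": 0.22,
    "real_assets": 0.10,
    "hedge_funds": 0.10,
    "public_stocks": 0.45,
    "private_equity": 0.10,
}

def blended_return(weights: dict, returns: dict) -> float:
    """Weighted-average expected return across asset classes."""
    return sum(weights[k] * returns[k] for k in weights)

print(f"Blend at the plans' assumptions: {blended_return(HYPOTHETICAL_WEIGHTS, ASSUMED_RETURNS):.2%}")

# Swap in the column's more sober 5.9% estimate for public stocks and the
# blend drops well short of the 7.6% the plans report expecting.
sober = dict(ASSUMED_RETURNS, public_stocks=0.059)
print(f"Blend with 5.9% stocks:          {blended_return(HYPOTHETICAL_WEIGHTS, sober):.2%}")
```

With these particular weights the first figure comes out near 7.6% and the second near 6.3%, which is the gap the column is pointing at: the asset mix can only hit the target if the optimistic per-asset assumptions hold.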
Javier E

The Bezos Earth Fund Indicates a National Failing - The Atlantic - 0 views

  • Jeff Bezos announced the creation of the Bezos Earth Fund, which will disburse $10 billion in the name of combatting climate change. The fund is a triumph of philanthropy—and a perfect emblem of a national failing.
  • In a healthy democracy, the world’s richest man wouldn’t be able to painlessly make a $10 billion donation. His fortune would be mitigated by the tax collector; antitrust laws would constrain the growth of his business. Instead of relying on a tycoon to bankroll the national response to an existential crisis, there would be a national response.
  • in an age of political dysfunction, Bezos has begun to subsume the powers of the state. Where the government once funded the ambitious exploration of space, Bezos is leading that project, spending a billion dollars each year to build rockets and rovers. His company, Amazon, is spearheading an experimental effort to fix American health care; it will also spend $700 million to retrain workers in the shadow of automation and displacement.
  • ...9 more annotations...
  • Bezos is providing the vital infrastructure of state. When Amazon locates its second headquarters on the Potomac, staring across the river at the capital, it will provide a perfect geographic encapsulation of the new balance of power.
  • there is no meaningful public oversight of Bezos’s power. His investments and donations—not to mention the dominance of his sprawling firm and his ownership of one of the nation’s most important newspapers—give him an outsize role in shaping the human future.
  • There’s no clear sense of the projects it will bankroll, even though a contribution of that scale will inevitably set the agenda of academics and nongovernmental organizations.
  • Bezos’s personal biases—his penchant for technological solutions, his skepticism of government regulation—will likely shape how the Bezos Earth Fund disburses cash. And that will, in turn, shape how activists and researchers craft their grant proposals, how they attempt to please a funder who can float their operations.
  • Even if Amazon aims to slash its own emissions, it’s creating an economy that seems likely to undermine its stated goal of carbon neutrality. A reasonable debate about planetary future would at least question the wisdom of the same-day delivery of plastic tchotchkes made in China.
  • Then there are the policies that permit companies, like Amazon, to pay virtually nothing in taxes—revenue that would ideally fund, say, a Green New Deal. It hardly seems likely that the Bezos Earth Foundation will seek to erode the very basis of the fortune that funds it.
  • A skeptical response to the Bezos Earth Fund doesn’t preclude the hope that it will do real good. Michael Bloomberg’s climate philanthropy has played an important role in shutting down coal-fired power plants.
  • In these years of polarization and dysfunction, the public keeps turning to saviors who present themselves as outsiders and promise transformation. Trump, of course, billed himself as this sort of salvific figure. But instead of curing voters of this yearning, he seems to have exacerbated it. The current temptation comes in the form of billionaires who exude competence
  • That the public seems indifferent to the dangers of a growing plutocracy is perhaps the greatest national failing of them all.
yehbru

Biden budget would give CDC biggest funding boost in nearly 20 years - 0 views

  • The budget blueprint for fiscal 2022 would include $8.7 billion in discretionary funding for the Centers for Disease Control and Prevention, according to budget documents shared by the Office of Management and Budget.
  • The agency said that budget bump would build on the CDC investments doled out in the American Rescue Plan, the $1.9 trillion Covid relief plan that Biden signed into law in March.
  • The new funding would be used to “support core public health capacity improvements in States and Territories, modernize public health data collection nationwide, train new epidemiologists and other public health experts, and rebuild international capacity to detect, prepare for, and respond to emerging global threats,” the OMB said.
  • ...4 more annotations...
  • While the CDC funding request is a big increase from recent years, it comprises just a small slice of Biden’s $6 trillion budget proposal for 2022.
  • The budget materials say $153 million would be allocated for the CDC’s Social Determinants of Health program to work on “improving health equity and data collection for racial and ethnic populations.”
  • The government would also provide $100 million for the CDC’s Climate and Health program as part of a $1.2 billion investment in strengthening resilience to wildfires, floods, droughts and other climate-related disasters.
  • Overall, HHS is requesting $133.7 billion in discretionary funding — a $25.3 billion, or 23.4%, bump from the enacted budget of fiscal 2021.
Javier E

The nation's public health agencies are ailing when they're needed most - The Washingto... - 0 views

  • At the very moment the United States needed its public health infrastructure the most, many local health departments had all but crumbled, proving ill-equipped to carry out basic functions let alone serve as the last line of defense against the most acute threat to the nation’s health in generations.
  • Epidemiologists, academics and local health officials across the country say the nation’s public health system is one of many weaknesses that continue to leave the United States poorly prepared to handle the coronavirus pandemic
  • That system lacks financial resources. It is losing staff by the day.
  • ...31 more annotations...
  • Even before the pandemic struck, local public health agencies had lost almost a quarter of their overall workforce since 2008 — a reduction of almost 60,000 workers
  • The agencies’ main source of federal funding — the Centers for Disease Control and Prevention’s emergency preparedness budget — had been cut 30 percent since 2003. The Trump administration had proposed slicing even deeper.
  • According to David Himmelstein of the CUNY School of Public Health, global consensus is that, at minimum, 6 percent of a nation’s health spending should be devoted to public health efforts. The United States, he said, has never spent more than half that much.
  • the problems have been left to fester.
  • Delaware County, Pa., a heavily populated Philadelphia suburb, did not even have a public health department when the pandemic struck and had to rely on a neighbor to mount a response.
  • With plunging tax receipts straining local government budgets, public health agencies confront the possibility of further cuts in an economy gutted by the coronavirus. It is happening at a time when health departments are being asked to do more than ever.
  • While the country spends roughly $3.6 trillion every year on health, less than 3 percent of that spending goes to public health and prevention
  • “Why an ongoing government function should depend on episodic grants rather than consistent funding, I don’t know,” he added. “That would be like seeing that the military is going to apply for a grant for its regular ongoing activities.”
  • Compared with Canada, the United Kingdom and northern European countries, the United States — with a less generous social safety net and no universal health care — is investing less in a system that its people rely on more.
  • Himmelstein said that the United States has never placed much emphasis on public health spending but that the investment began to decline even further in the early 2000s. The Great Recession fueled further cuts.
  • Plus, the U.S. public health system relies heavily on federal grants.
  • “That’s the way we run much of our public health activity for local health departments. You apply to the CDC, which is the major conduit for federal funding to state and local health departments,” Himmelstein said. “You apply to them for funding for particular functions, and if you don’t get the grant, you don’t have the funding for that.”
  • Many public health officials say a lack of a national message and approach to the pandemic has undermined their credibility and opened them up to criticism.
  • Few places were less prepared for covid-19’s arrival than Delaware County, Pa., where Republican leaders had decided they did not need a public health department at all
  • At the same time, many countries that invest more in public health infrastructure also provide universal medical coverage that enables them to provide many common public health services as part of their main health-care-delivery system.
  • Taylor and other elected officials worked out a deal with neighboring Chester County in which Delaware County paid affluent Chester County’s health department to handle coronavirus operations for both counties for now.
  • One reason health departments are so often neglected is their work focuses on prevention — of outbreaks, sexually transmitted diseases, smoking-related illnesses. Local health departments describe a frustrating cycle: The more successful they are, the less visible problems are and the less funding they receive. Often, that sets the stage for problems to explode again — as infectious diseases often do.
  • It has taken years for many agencies to rebuild budgets and staffing from deep cuts made during the last recession.
  • During the past decade, many local health departments have seen annual rounds of cuts, punctuated with one-time infusions of money following crises such as outbreaks of Zika, Ebola, measles and hepatitis. The problem with that cycle of feast or famine funding is that the short-term money quickly dries up and does nothing to address long-term preparedness.
  • “It’s a silly strategic approach when you think about what’s needed to protect us long term,”
  • She compared the country’s public health system to a house with deep cracks in the foundation. The emergency surges of funding are superficial repairs that leave those cracks unaddressed.
  • “We came into this pandemic at a severe deficit and are still without a strategic goal to build back that infrastructure. We need to learn from our mistakes,”
  • With the economy tanking, the tax bases for cities and counties have shrunken dramatically — payroll taxes, sales taxes, city taxes. Many departments have started cutting staff. Federal grants are no sure thing.
  • 80 percent of counties have reported their budget was affected in the current fiscal year because of the crisis. Prospects are even more dire for future budget periods, when the full impact of reduced tax revenue will become evident.
  • Christine Hahn, medical director for Idaho’s division of public health and a 25-year public health veteran, has seen the state make progress in coronavirus testing and awareness. But like so many public health officials across the country taking local steps to deal with what has become a national problem, she is limited by how much government leaders say she can do and by what citizens are willing to do.
  • “I’ve been through SARS, the 2009 pandemic, the anthrax attacks, and of course I’m in rural Idaho, not New York City and California,” Hahn said. “But I will say this is way beyond anything I’ve ever experienced as far as stress, workload, complexity, frustration, media and public interest, individual citizens really feeling very strongly about what we’re doing and not doing.”
  • “I think the general population didn’t really realize we didn’t have a health department. They just kind of assumed that was one of those government agencies we had,” Taylor said. “Then the pandemic hit, and everyone was like, ‘Wait, hold on — we don’t have a health department? Why don’t we have a health department?’ ”
  • “People locally are looking to see what’s happening in other states, and we’re constantly having to talk about that and address that,”
  • “I’m mindful of the credibility of our messaging as people say, ‘What about what they’re doing in this place? Why are we not doing what they’re doing?’ ”
  • Many health experts worry the challenges will multiply in the fall with the arrival of flu season.
  • “The unfolding tragedy here is we need people to see local public health officials as heroes in the same way that we laud heart surgeons and emergency room doctors,” Westergaard, the Wisconsin epidemiologist, said. “The work keeps getting higher, and they’re falling behind — and not feeling appreciated by their communities.”
lilyrashkind

The US Funded Universal Childcare During World War II-Then Stopped - HISTORY - 0 views

  • When the United States started recruiting women for World War II factory jobs, there was a reluctance to call stay-at-home mothers with young children into the workforce. That changed when the government realized it needed more wartime laborers in its factories. To allow more women to work, the government began subsidizing childcare for the first (and only) time in the nation’s history.
  • Before World War II, organized “day care” didn’t really exist in the United States. The children of middle- and upper-class families might go to private nursery schools for a few hours a day, says Sonya Michel, a professor emerita of history, women’s studies and American studies at the University of Maryland-College Park and author of Children’s Interests/Mothers’ Rights: The Shaping of America’s Child Care Policy. (In German communities, five- and six-year-olds went to half-day Kindergartens.)
  • The Defense Housing and Community Facilities and Services Act, known as the Lanham Act, gave the Federal Works Agency the authority to fund the construction of houses, schools and other infrastructure for laborers in the growing defense industry. It was not specifically meant to fund childcare, but in late 1942, the government used it to fund temporary day care centers for the children of mothers working wartime jobs.
  • ...7 more annotations...
  • They also provided up to three meals a day for children, with some offering prepared meals for mothers to take with them when they picked up their kids.
  • “The ones that we often hear about were the ‘model’ day nurseries that were set up at airplane factories [on the West coast],” says Michel. “Those were ones where the federal funding came very quickly, and some of the leading voices in the early childhood education movement…became quickly involved in setting [them] up,” she says. 
  • This was during the Cold War, a time when anti-childcare activists pointed to the fact that the Soviet Union funded childcare as an argument for why the United States shouldn’t. When Congress passed a universal childcare bill in 1971, President Richard Nixon vetoed it, arguing that it would “commit the vast moral authority of the National Government to the side of communal approaches to child rearing over against the family-centered approach.”
  • When the World War II childcare centers first opened, many women were reluctant to hand their children over to them. According to Chris M. Herbst, a professor of public affairs at Arizona State University who has written about these programs in the Journal of Labor Economics, a lot of these women ended up having positive experiences.
  • As the war ended in August 1945, the Federal Works Agency announced it would stop funding childcare as soon as possible. Parents responded by sending the agency 1,155 letters, 318 wires, 794 postcards and petitions with 3,647 signatures urging the government to keep them open. In response, the U.S. government provided additional funding for childcare through February 1946. After that, it was over.
  • For these centers, organizers enlisted architects to build attractive buildings that would cater to the needs of childcare, specifically. “There was a lot of publicity about those, but those were unusual. Most of the childcare centers were kind of makeshift. They were set up in [places like] church basements.”
  • It was the only time in U.S. history that the country came close to instituting universal childcare.
Javier E

The Plight of the Overworked Nonprofit Employee - The Atlantic - 0 views

  • Many nonprofit organizations stare down a shared set of challenges: In a 2013 report, the Urban Institute surveyed over 4,000 nonprofits of a wide range of types and sizes across the continental U.S. It found that all kinds of nonprofits struggled with delays in payment for contracts, difficulty securing funding for the full cost of their services, and other financial issues.
  • Recent years have been especially hard for many nonprofits. Most have annual budgets of less than $1 million, and those budgets took a big hit from the recession, when federal, municipal, and philanthropic funding dried up. On top of that, because so many nonprofits depend on government money, policy changes can cause funding priorities to change, which in turn can put nonprofits in a bind.
  • The pressure from funders to tighten budgets and cut costs can produce what researchers call the “nonprofit starvation cycle.” The cycle starts with funders’ unrealistic expectations about the costs of running a nonprofit. In response, nonprofits try to spend less on overhead (like salaries) and under-report expenses to try to meet those unrealistic expectations. That response then reinforces the unrealistic expectations that began the cycle. In this light, it’s no surprise that so many nonprofits have come to rely on unpaid work.
  • ...15 more annotations...
  • Strangely, though nonprofits are increasingly expected to perform like businesses, they do not get the same leeway in funding that government-contracted businesses do. They don’t have nearly the bargaining power of big corporations, or the ability to raise costs for their products and services, because of tight controls on grant funding. “D.C. is full of millionaires who contract with government in the defense field, and they make a killing, and yet if you’re a nonprofit, chances are you aren’t getting the full amount of funding to cover the cost of the services required,” Iliff said. “Can you imagine Lockheed Martin or Boeing putting up with a government contract that didn’t allow for overhead?”
  • When faced with dwindling funding, one response would be to cut a program or reduce the number of people an organization serves. But nonprofit leaders have shown themselves very reluctant to do that. Instead, many meet financial challenges by squeezing more work out of their staffs without a proportional increase in their pay:
  • nonprofits like PIRG, for example, have a tradition of forcing employees to work long, unpaid hours—especially their youngest staff. “There’s a culture that says, ‘Young people are paying their dues. It’s okay for them to be paid for fewer hours than they’re actually working because it’s in the effort of helping them grow up and contribute to something greater than they are,’” Boris says.
  • These nonprofit employees are saying that their operations depend on large numbers of their lowest-paid staff working unpaid overtime hours. One way to get  to that point would be to face a series of choices between increased productivity on the one hand and reduced hours, increased pay, or more hiring on the other, and to choose more productivity every time. That some nonprofits have done this speaks to a culture that can put the needs of staff behind mission-driven ambitions.
  • In the 1970s, 62 percent of full-time, salaried workers qualified for mandatory overtime pay when they worked more than 40 hours in a week. Today, because the overtime rules have not had a major update since then (until this one), only 7 percent of workers are covered, whether they work in the nonprofit sector or elsewhere. In other words, U.S. organizations—nonprofit or otherwise—have been given the gift of a large pool of laborers who, as long as they clear a relatively low earnings threshold and do tasks that meet certain criteria, do not have to be paid overtime.
  • Unsurprisingly, many nonprofits have taken advantage of that pool of free work. (For-profit companies have too, but they also have the benefit of being more in control of their revenue streams.)
  • “There is this feeling that the mission is so important that nothing should get in the way of it,”
  • “Too often, I have seen the passion for social change turned into a weapon against the very people who do much—if not most—of the hard work, and put in most of the hours,” Hastings recently wrote on her blog. “Because they are highly motivated by passion, the reasoning goes, they don’t need to be motivated by decent salaries or sustainable work hours or overtime pay.”
  • A 2011 survey of more than 2,000 nonprofit employees by Opportunity Knocks, a human-resources organization that specializes in nonprofits, in partnership with Jessica Word, an associate professor of public administration at the University of Nevada, Las Vegas, found that half of employees in the nonprofit sector may be burned out or in danger of burnout.
  • “These are highly emotional and difficult jobs,” she said, adding, “These organizations often have very high rates of employee turnover, which results from a combination of burnout and low compensation.” Despite the dearth of research, Word’s findings don’t appear to be unusual: A more recent study of nonprofits in the U.S. and Canada found that turnover, one possible indicator of burnout, is higher in nonprofits than in the overall labor market.
  • for all their hours and emotional labor, nonprofit employees generally don’t make much money. A 2014 study by Third Sector New England, a resource center for nonprofits, found that 43 percent of nonprofit employees in New England were making less than $28,000 per year—far less than a living wage for families with children in most cities in the United States, and well below the national median income of between $40,000 and $50,000 per year.
  • Why would nonprofit workers be willing to stay in jobs where they are underpaid, or, in some cases, accept working conditions that violate the spirit of the labor laws that protect them? One plausible reason is that they are just as committed to the cause as their superiors
  • But it also might be that some nonprofits exploit gray areas in the law to cut costs. For instance, only workers who are labelled as managers are supposed to be exempt from overtime, but many employers stretch the definition of “manager” far beyond its original intent.
  • even regardless of these designations, the emotionally demanding work at many nonprofits is sometimes difficult to shoehorn into a tidy 40-hours-a-week schedule. Consider Elle Roberts, who was considered exempt from overtime restrictions and was told not to work more than 40 hours a week when, as a young college grad, she worked at a domestic-violence shelter in northwest Indiana. Doing everything from home visits to intake at the shelter, Roberts still ignored her employer’s dictates and regularly worked well more than 40 hours a week providing relief for women in crisis. Yet she was not paid for that extra time.
  • “The unspoken expectation is that you do whatever it takes to get whatever it is done for the people that you’re serving,” she says. “And anything less than that, you’re not quite doing enough.”
izzerios

Trump reverses abortion-related U.S. policy, bans funding to international health group... - 0 views

  • a rule is back in effect to block U.S. international family-planning assistance to foreign organizations that use funds from other sources to perform or discuss abortions.
  • reinstating a rule first instituted by President Reagan
  • Stop providing abortions, or any information about abortions, or lose valuable dollars from the United States, the biggest global funder of family-planning services.
  • ...15 more annotations...
  • denunciations from family-planning groups and their Democratic allies and praise from pro-life officials and Republicans.
  • officially known as the “Mexico City policy” and referred to as “the global gag rule” by its critics
  • repealed and reinstated every time a different political party has assumed power in the White House.
  • Even when the rule has not been in effect, however, existing federal law has barred the use of U.S. funds to pay for abortions anywhere in the world.
  • sparked a flurry of angry responses from Democratic lawmakers and women’s health organizations.
  • “We won’t go back to coat hanger medicine.”
  • Pelosi said in a statement that Trump’s order “returns us to disgraceful era that dishonored the American values of free speech and inflicted untold suffering on millions of women around the world.”
  • the group said, it received $30 million in aid from the United States Agency for International Development
  • that implementation of the Mexico City policy was linked to increases in abortion rates in sub-Saharan African countries
  • couldn’t draw “definitive conclusions about the underlying cause of this increase.”
  • “This is a vital step in the journey to make America great again, recognizing and affirming the universal ideal that all human beings have inherent worth and dignity, regardless of their age or nationality,” said Tony Perkins.
  • “Funding foreign groups that promote or participate in abortion violates the principle that there should be a ‘wall of separation’ between taxpayer money and abortion,”
  • Trump has started to “make good on his promises to the millions of pro-life Americans that helped him ascend to this office.”
  • “marked an expansion of existing legislative restrictions that already prohibited U.S. funding for abortion internationally,”
  • the foundation said, NGOs could use non-U.S. funds to engage in abortion-related activities as long as they kept separate accounts for any U.S. money received.
maddieireland334

Utah Senator says Flint doesn't need aid, blocks lead bill - 0 views

  • A Republican U.S. senator from Utah is holding up a federal funding package worth more than $100 million which could help address the issue of high lead levels found in Flint’s water, saying in a statement on Friday that no federal aid is needed at this time.
  • Sen. Mike Lee said the legislation, sponsored by U.S. Sens. Debbie Stabenow and Gary Peters, both D-Mich., represented a “federalizing” of water infrastructure. He objected to the bill, arguing the state has not directly asked Congress for any emergency spending and has its own surplus to spend if it needs money.
  • Stabenow, who worked with Peters for weeks to secure a group of Republican and Democratic cosponsors for the legislation, expressed surprise that Lee has placed a hold on the measure, which effectively keeps the Senate from voting on it, even though it is fully paid for.
  • ...6 more annotations...
  • Stabenow and Peters say federal funding is needed to replace aging pipes in Flint and other parts of the infrastructure to ensure public safety and restore confidence in the water system in the Michigan city.
  • It would authorize the federal Drinking Water State Revolving Fund to make up to $100 million in grants between now and October 2017 "to any state that receives an emergency declaration ... to a public health threat from lead or other contaminants in a public drinking water system."
  • The bill also authorizes $50 million for public health — though that funding is not specific to Flint — including $17.5 million to monitor the health effects of lead contamination in municipal water, along with allowing Michigan to use other funding to repay earlier federal loans taken out by Flint for work on its water system.
  • While Lee said, however, that Gov. Rick Snyder hadn’t asked Congress to authorize emergency funds, that misses some of the nuances of the situation in Flint, where a lack of corrosion-control treatment when the city switched to the Flint River in April 2014 allowed lead to leach from aging water pipes.
  • Snyder initially requested that President Barack Obama declare a major disaster in Flint and provide more than $700 million for infrastructure repairs to pipes and the water system. But Obama turned down that request because federal law only allows for such declarations in the cases of natural disasters, fires or explosions.
  • “What is happening to the people of Flint, Michigan is a man-made disaster,” Lee said. “Congress has special mechanisms for emergency spending when it is needed. But to date Michigan’s governor has not asked us for any, nor have Michigan’s senators proposed any. Contrary to media reports, there is no federal ‘aid package’ for Flint even being considered.”
katyshannon

Texas Eliminates Funding to Planned Parenthood | TIME - 0 views

  • Texas is eliminating taxpayer funding for Planned Parenthood in the state, based on concerns over the organization’s fetal tissue donations, the governor announced on Monday.
  • Texas Governor Greg Abbott’s office issued a statement, saying Texas is ending Medicaid’s participation with Planned Parenthood, which means the women’s reproductive health nonprofit will no longer receive Medicaid funding in the state.
  • “Following the release of gruesome videos filmed at Planned Parenthood facilities, including Gulf Coast in Texas, Governor Greg Abbott announced his LIFE Initiative to provide greater protections for children in the womb and prevent the sale of baby body parts,” Abbott’s statement reads. The cancellation “calls for funding to Planned Parenthood and other abortion providers out of taxpayer money to be eliminated completely, both at the State and local levels,” according to the announcement.
  • ...2 more annotations...
  • Abbott said that canceling Medicaid participation for Planned Parenthood in Texas moves the state closer to “providing greater access to safe healthcare for women while protecting our most vulnerable – the unborn.”
  • Planned Parenthood has long said that its fetal tissue donations are legal, since the organization did not accept payment for the donations beyond the reimbursement of costs. On Oct. 13, Planned Parenthood announced that it will no longer accept reimbursements for expenses related to donated fetal tissue. “Planned Parenthood’s policies on fetal tissue donation already exceed the legal requirements,” Planned Parenthood President Cecile Richards wrote in a letter to the National Institutes of Health. “Now we’re going even further in order to take away any basis for attacking Planned Parenthood to advance an anti-abortion political agenda.”
  • Texas cuts funding to Planned Parenthood